AASD4015 - Advanced Mathematical Concepts for Deep Learning¶

Project: Analyze Residual Blocks & Upsampling Blocks for Enhanced Deep Residual Networks¶

Team Members:¶

  • Khandaker Nahid Mahmud (101427435)
  • Siddhant Gite (101359755)

Problem Statement:¶

Image super-resolution (SR), particularly single image super-resolution (SISR), aims to reconstruct a high-resolution image from a single low-resolution image. Recent research on super-resolution has progressed with the development of deep convolutional neural networks (DCNNs). In particular, residual learning techniques exhibit improved performance.

In the paper Enhanced Deep Residual Networks for Single Image Super-Resolution (EDSR), Bee Lim et al. proposed the EDSR architecture, which is based on the SRResNet architecture.

In this project we implement the baseline single-scale model proposed in the paper and study the effect of residual blocks and upsampling blocks on image quality and training time. We analyzed the performance by varying the following parameters:

  • Number of Residual blocks
  • Types of Upsampling blocks - Sub-Pixel CNN, Conv2DTranspose & UpSampling2D

Introduction¶

The EDSR architecture is based on the SRResNet architecture and consists of multiple residual blocks. It uses constant scaling layers instead of batch normalization layers to produce consistent results.

The residual block design of EDSR also differs from that of ResNet: the batch normalization layers have been removed (together with the final ReLU activation). Since batch normalization layers normalize the features, they limit the range flexibility of the output values, so it is better to remove them. Removing them also reduces the amount of GPU RAM required by the model, since a batch normalization layer consumes the same amount of memory as the preceding convolutional layer.
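The block described above can be sketched as follows. This is an illustrative sketch, not the notebook's own implementation (which appears later): conv-ReLU-conv with a skip connection and no batch normalization. The 0.1 residual scaling factor shown here is the constant scaling the paper applies in its larger models; the baseline model omits it.

```python
import tensorflow as tf
from tensorflow.keras import layers

def edsr_res_block(inputs, filters=64, scaling=0.1):
    """EDSR-style residual block: no batch norm, no final ReLU."""
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(inputs)
    x = layers.Conv2D(filters, 3, padding="same")(x)
    # Constant residual scaling (used in the paper's larger models)
    x = layers.Lambda(lambda t: t * scaling)(x)
    return layers.Add()([inputs, x])

# The skip connection requires the output shape to match the input shape
inp = layers.Input(shape=(None, None, 64))
model = tf.keras.Model(inp, edsr_res_block(inp))
```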

Reference: https://sh-tsang.medium.com/review-edsr-mdsr-enhanced-deep-residual-networks-for-single-image-super-resolution-super-4364f3b7f86f

Modified Residual Blocks

Apart from the residual blocks, another key component of the architecture is the upsampling block. There are several strategies for upsampling, and we explored the following:

  1. Sub-Pixel CNN
  2. Conv2DTranspose
  3. UpSampling2D

We learned the concepts from the following articles and videos:

https://www.analyticsvidhya.com/blog/2021/05/deep-learning-for-image-super-resolution/
http://krasserm.github.io/2019/09/04/super-resolution/
https://www.youtube.com/watch?v=fMwti6zFcYY&ab_channel=DigitalSreeni
https://towardsdatascience.com/types-of-convolutions-in-deep-learning-717013397f4d

1. Sub-Pixel CNN:¶

Given an input of size H×W×C and an upsampling factor s, the sub-pixel convolution layer first creates a representation of size H×W×s²C via a convolution operation and then reshapes it to sH×sW×C, completing the upsampling operation. The result is an output spatially scaled by factor s.
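The shape bookkeeping can be verified with a small sketch (shapes are the point here; the shapes shown are assumptions matching the notebook's 24×24 patches), using the same `tf.nn.depth_to_space` call the model code uses later:

```python
import tensorflow as tf

s, C = 2, 64
x = tf.random.normal((1, 24, 24, C))                        # H x W x C input
# Convolution expands channels to s^2 * C ...
x = tf.keras.layers.Conv2D(C * s**2, 3, padding="same")(x)  # -> (1, 24, 24, 256)
# ... and depth_to_space rearranges them into an sH x sW x C grid
y = tf.nn.depth_to_space(x, block_size=s)                   # -> (1, 48, 48, 64)
```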

2. Transposed convolution:¶

A transposed convolution layer tries to perform the inverse transformation of a normal convolution, i.e. it predicts a possible input from feature maps shaped like a convolution's output. Specifically, it increases the image resolution by expanding the image with inserted zeros and then performing a convolution.
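A minimal sketch of this behaviour: with `padding="same"` and stride s, `Conv2DTranspose` multiplies the spatial dimensions by s (the input shape below is an assumption matching the notebook's patches):

```python
import tensorflow as tf

x = tf.random.normal((1, 24, 24, 64))
# A stride-2 transposed convolution doubles height and width,
# and unlike UpSampling2D its kernel weights are learned
up = tf.keras.layers.Conv2DTranspose(64, 3, strides=2, padding="same")
y = up(x)  # -> (1, 48, 48, 64)
```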

3. Upsampling2D:¶

UpSampling2D simply scales up the image using nearest-neighbour or bilinear interpolation. Its advantage is that it is cheap: it has no trainable parameters.
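A quick sketch: with nearest-neighbour interpolation, each pixel is simply repeated in a 2×2 block (input shape again an assumption):

```python
import tensorflow as tf

x = tf.random.normal((1, 24, 24, 64))
# No learned weights: each input pixel is copied into a 2x2 output block
y = tf.keras.layers.UpSampling2D(size=2, interpolation="nearest")(x)  # -> (1, 48, 48, 64)
```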

The performance of the models is measured by the quality of the generated images. To quantify reconstruction quality, the peak signal-to-noise ratio (PSNR) is measured and compared.
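For 8-bit images, PSNR = 20·log₁₀(255) − 10·log₁₀(MSE), where MSE is the mean squared error between the reconstruction and the ground truth. A minimal sketch with toy constant images (values chosen so MSE = 100), cross-checked against `tf.image.psnr`:

```python
import tensorflow as tf

hr = tf.fill((96, 96, 3), 100.0)  # toy "ground truth"
sr = tf.fill((96, 96, 3), 90.0)   # toy "reconstruction" -> MSE = 100
mse = tf.reduce_mean(tf.square(hr - sr))
log10 = lambda t: tf.math.log(t) / tf.math.log(10.0)
# PSNR in decibels, computed from the definition
manual = 20.0 * log10(255.0) - 10.0 * log10(mse)
# Built-in computation, as used in the training code below
builtin = tf.image.psnr(hr, sr, max_val=255.0)
print(float(manual), float(builtin))  # both ≈ 28.13 dB
```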

Dataset¶

DIV2K:¶

We used the same dataset as the paper - DIV2K. This is a popular single-image super-resolution dataset which contains 1,000 images with different scenes, split into 800 for training, 100 for validation and 100 for testing. The dataset contains low-resolution images with different types of degradations: apart from the standard bicubic downsampling, several other degradations are used to synthesize the low-resolution images for the different challenge tracks.

The dataset is available in tensorflow datasets: https://www.tensorflow.org/datasets/catalog/div2k

Summary of Findings¶

In our experiments, transposed convolution achieved better results than the default sub-pixel convolution method, but it takes more time to train. Interestingly, with only 2 residual blocks the simple UpSampling2D method achieved very good results in terms of both time and quality.

Reference Notebook:¶

The EDSR notebook from the Keras website was taken as a reference to implement the base model. We then experimented with various combinations of residual blocks and upsampling blocks.

https://keras.io/examples/vision/edsr/

Environment¶

The notebook was implemented on a g4dn.xlarge EC2 instance with 4 vCPUs, 16 GB RAM and one GPU with 16 GB GPU RAM. An AWS Deep Learning AMI (DLAMI) with the Ubuntu operating system was used to take advantage of the NVIDIA T4 GPU.

Imports¶

In [1]:
import numpy as np
import tensorflow as tf
import tensorflow_datasets as tfds
import matplotlib.pyplot as plt

from tensorflow import keras
from tensorflow.keras import layers

AUTOTUNE = tf.data.AUTOTUNE

Download the training dataset¶

We use the 4x bicubic downsampled images from the TensorFlow DIV2K dataset as our "low quality" input.

In [2]:
# Download DIV2K from TF Datasets
# Using bicubic 4x degradation type
div2k_data = tfds.image.Div2k(config="bicubic_x4")
div2k_data.download_and_prepare()

# Taking train data from div2k_data object
train = div2k_data.as_dataset(split="train", as_supervised=True)
train_cache = train.cache()
# Validation data
val = div2k_data.as_dataset(split="validation", as_supervised=True)
val_cache = val.cache()
Downloading and preparing dataset 3.97 GiB (download: 3.97 GiB, generated: Unknown size, total: 3.97 GiB) to /root/tensorflow_datasets/div2k/bicubic_x4/2.0.0...
EXTRACTING {'train_lr_url': 'https://data.vision.ee.ethz.ch/cvl/DIV2K/DIV2K_train_LR_bicubic_X4.zip', 'valid_lr_url': 'https://data.vision.ee.ethz.ch/cvl/DIV2K/DIV2K_valid_LR_bicubic_X4.zip', 'train_hr_url': 'https://data.vision.ee.ethz.ch/cvl/DIV2K/DIV2K_train_HR.zip', 'valid_hr_url': 'https://data.vision.ee.ethz.ch/cvl/DIV2K/DIV2K_valid_HR.zip'}
Dataset div2k downloaded and prepared to /root/tensorflow_datasets/div2k/bicubic_x4/2.0.0. Subsequent calls will reuse this data.

Flip, crop and resize images¶

In [3]:
def flip_left_right(lowres_img, highres_img):
    """Flips Images to left and right."""

    # Outputs random values from a uniform distribution in between 0 to 1
    rn = tf.random.uniform(shape=(), maxval=1)
    # If rn is less than 0.5 it returns original lowres_img and highres_img
    # If rn is greater than 0.5 it returns flipped image
    return tf.cond(
        rn < 0.5,
        lambda: (lowres_img, highres_img),
        lambda: (
            tf.image.flip_left_right(lowres_img),
            tf.image.flip_left_right(highres_img),
        ),
    )


def random_rotate(lowres_img, highres_img):
    """Rotates Images by 90 degrees."""

    # Outputs random values from uniform distribution in between 0 to 4
    rn = tf.random.uniform(shape=(), maxval=4, dtype=tf.int32)
    # Here rn signifies number of times the image(s) are rotated by 90 degrees
    return tf.image.rot90(lowres_img, rn), tf.image.rot90(highres_img, rn)


def random_crop(lowres_img, highres_img, hr_crop_size=96, scale=4):
    """Crop images.

    low resolution images: 24x24
    high resolution images: 96x96
    """
    lowres_crop_size = hr_crop_size // scale  # 96//4=24
    lowres_img_shape = tf.shape(lowres_img)[:2]  # (height,width)

    lowres_width = tf.random.uniform(
        shape=(), maxval=lowres_img_shape[1] - lowres_crop_size + 1, dtype=tf.int32
    )
    lowres_height = tf.random.uniform(
        shape=(), maxval=lowres_img_shape[0] - lowres_crop_size + 1, dtype=tf.int32
    )

    highres_width = lowres_width * scale
    highres_height = lowres_height * scale

    lowres_img_cropped = lowres_img[
        lowres_height : lowres_height + lowres_crop_size,
        lowres_width : lowres_width + lowres_crop_size,
    ]  # 24x24
    highres_img_cropped = highres_img[
        highres_height : highres_height + hr_crop_size,
        highres_width : highres_width + hr_crop_size,
    ]  # 96x96

    return lowres_img_cropped, highres_img_cropped

Prepare a tf.data.Dataset object¶

We augment the training data with random horizontal flips and random 90° rotations.

As low resolution images, we use 24x24 RGB input patches.

In [4]:
def dataset_object(dataset_cache, training=True):

    ds = dataset_cache
    ds = ds.map(
        lambda lowres, highres: random_crop(lowres, highres, scale=4),
        num_parallel_calls=AUTOTUNE,
    )

    if training:
        ds = ds.map(random_rotate, num_parallel_calls=AUTOTUNE)
        ds = ds.map(flip_left_right, num_parallel_calls=AUTOTUNE)
    # Batching Data
    ds = ds.batch(16)

    if training:
        # Repeating data so that the cardinality of the dataset becomes infinite
        ds = ds.repeat()
    # prefetching allows later images to be prepared while the current image is being processed
    ds = ds.prefetch(buffer_size=AUTOTUNE)
    return ds


train_ds = dataset_object(train_cache, training=True)
val_ds = dataset_object(val_cache, training=False)

Visualize the data¶

Let's visualize a few sample images:

In [5]:
lowres, highres = next(iter(train_ds))

# High Resolution Images
plt.figure(figsize=(10, 10))
for i in range(9):
    ax = plt.subplot(3, 3, i + 1)
    plt.imshow(highres[i].numpy().astype("uint8"))
    plt.title(highres[i].shape)
    plt.axis("off")

# Low Resolution Images
plt.figure(figsize=(10, 10))
for i in range(9):
    ax = plt.subplot(3, 3, i + 1)
    plt.imshow(lowres[i].numpy().astype("uint8"))
    plt.title(lowres[i].shape)
    plt.axis("off")

Model definitions and Training¶

In the paper, the authors train three models: EDSR, MDSR, and a baseline model. In this code example, we only train the baseline model.

Peak signal-to-noise ratio (PSNR)¶

In [6]:
def PSNR(super_resolution, high_resolution):
    """Compute the peak signal-to-noise ratio, measures quality of image."""
    # Max value of pixel is 255
    psnr_value = tf.image.psnr(high_resolution, super_resolution, max_val=255)[0]
    return psnr_value
In [7]:
def plot_results(lowres, preds):
    """
    Displays low resolution image and super resolution image
    """
    plt.figure(figsize=(24, 14))
    plt.subplot(132), plt.imshow(lowres), plt.title("Low resolution")
    plt.subplot(133), plt.imshow(preds), plt.title("Prediction")
    plt.show()

Build the Model¶

In [8]:
class EDSRModel(tf.keras.Model):
    def train_step(self, data):
        # Unpack the data. Its structure depends on your model and
        # on what you pass to `fit()`.
        x, y = data

        with tf.GradientTape() as tape:
            y_pred = self(x, training=True)  # Forward pass
            # Compute the loss value
            # (the loss function is configured in `compile()`)
            loss = self.compiled_loss(y, y_pred, regularization_losses=self.losses)

        # Compute gradients
        trainable_vars = self.trainable_variables
        gradients = tape.gradient(loss, trainable_vars)
        # Update weights
        self.optimizer.apply_gradients(zip(gradients, trainable_vars))
        # Update metrics (includes the metric that tracks the loss)
        self.compiled_metrics.update_state(y, y_pred)
        # Return a dict mapping metric names to current value
        return {m.name: m.result() for m in self.metrics}

    def predict_step(self, x):
        # Adding dummy dimension using tf.expand_dims and converting to float32 using tf.cast
        x = tf.cast(tf.expand_dims(x, axis=0), tf.float32)
        # Passing low resolution image to model
        super_resolution_img = self(x, training=False)
        # Clips the tensor from min(0) to max(255)
        super_resolution_img = tf.clip_by_value(super_resolution_img, 0, 255)
        # Rounds the values of a tensor to the nearest integer
        super_resolution_img = tf.round(super_resolution_img)
        # Removes the batch dimension of size 1 and converts to uint8
        super_resolution_img = tf.squeeze(
            tf.cast(super_resolution_img, tf.uint8), axis=0
        )
        return super_resolution_img

Residual Blocks¶

In [9]:
# Residual Block
def ResBlock(inputs):
    x = layers.Conv2D(64, 3, padding="same", activation="relu")(inputs)
    x = layers.Conv2D(64, 3, padding="same")(x)
    x = layers.Add()([inputs, x])
    return x

Upsampling Blocks¶

In [10]:
# Upsampling Block
def Upsampling(inputs, upblock='SubPixelConv', factor=2, **kwargs):
    if upblock=='Conv2DTranspose':
        x = layers.Conv2DTranspose(64,3, strides=factor, padding="same")(inputs)
        x = layers.Conv2DTranspose(64,3, strides=factor, padding="same")(x)
    elif upblock=='UpSampling2D':
        x = layers.UpSampling2D(factor)(inputs)
        x = layers.UpSampling2D(factor)(x)
    else:
        # Sub-pixel convolution
        x = layers.Conv2D(64 * (factor ** 2), 3, padding="same", **kwargs)(inputs)
        # pixel shuffle
        x = tf.nn.depth_to_space(x, block_size=factor)
        x = layers.Conv2D(64 * (factor ** 2), 3, padding="same", **kwargs)(x)
        # pixel shuffle
        x = tf.nn.depth_to_space(x, block_size=factor)   
        
    return x

Model generation function¶

In [11]:
def make_model(num_filters, num_of_residual_blocks, upblock='SubPixelConv'):
    """ 
    num_filters: Number of kernels
    num_of_residual_blocks: Number of residual block
    upblock: depth_to_space, Conv2DTranspose or UpSampling2D
    """
    # Flexible Inputs to input_layerConv2DTranspose
    input_layer = layers.Input(shape=(None, None, 3))
    # Scaling Pixel Values
    x = layers.Rescaling(scale=1.0 / 255)(input_layer)
    x = x_new = layers.Conv2D(num_filters, 3, padding="same")(x)

    # residual blocks
    for _ in range(num_of_residual_blocks):
        x_new = ResBlock(x_new)

    x_new = layers.Conv2D(num_filters, 3, padding="same")(x_new)
    x = layers.Add()([x, x_new])

    x = Upsampling(x, upblock=upblock)
    x = layers.Conv2D(3, 3, padding="same")(x)

    output_layer = layers.Rescaling(scale=255)(x)
    return EDSRModel(input_layer, output_layer)

Training Model variations¶

In [12]:
import time


number_residual_blocks = [2, 16]
upsample_block_types = ['Conv2DTranspose', 'UpSampling2D', 'SubPixelConv']

for nrs in number_residual_blocks:
    for ublock in upsample_block_types: 
        print (f"Train with Residual Blocks: {nrs} and Upsample method: {ublock}")
        
        model = make_model(num_filters=64, num_of_residual_blocks=nrs, upblock=ublock)
        
        # Using adam optimizer with initial learning rate as 1e-4, changing learning rate after 5000 steps to 5e-5
        optim_edsr = keras.optimizers.Adam(
            learning_rate=keras.optimizers.schedules.PiecewiseConstantDecay(
                boundaries=[5000], values=[1e-4, 5e-5]
            )
        )
        
        # print model summary
        model.summary()
        
        # Compiling model with loss as mean absolute error(L1 Loss) and metric as psnr
        model.compile(optimizer=optim_edsr, loss="mae", metrics=[PSNR])
        
        
        # Callbacks       
        model_file_path = f"Resblock{nrs}_{ublock}.keras"        
        callbacks = [
            keras.callbacks.ModelCheckpoint(
                filepath=model_file_path,
                save_best_only=True,
                monitor="val_PSNR",
                mode='max')
        ]
        
        # Train the model
        t0 = time.time()
        history = model.fit(
                train_ds,
                epochs=100,
                steps_per_epoch=200,
                validation_data=val_ds,
                callbacks=callbacks
        )
        
        print("Training time:", time.time()-t0)
        
        # Plot loss and psnr
        psnr = history.history["PSNR"]
        val_psnr = history.history["val_PSNR"]
        loss = history.history["loss"]
        val_loss = history.history["val_loss"]
        epochs = range(1, len(psnr) + 1)
        plt.plot(epochs, psnr, "bo", label="Training PSNR")
        plt.plot(epochs, val_psnr, "b", label="Validation PSNR")
        plt.title("Training and validation PSNR")
        plt.legend()
        plt.figure()
        plt.plot(epochs, loss, "bo", label="Training loss")
        plt.plot(epochs, val_loss, "b", label="Validation loss")
        plt.title("Training and validation loss")
        plt.legend()
        plt.show()
        
        # Show generated images
        for lowres, highres in val.take(6):
            lowres = tf.image.random_crop(lowres, (150, 150, 3))
            preds = model.predict_step(lowres)
            plot_results(lowres, preds)
Train with Residual Blocks: 2 and Upsample method: Conv2DTranspose
Model: "edsr_model"
__________________________________________________________________________________________________
 Layer (type)                   Output Shape         Param #     Connected to                     
==================================================================================================
 input_1 (InputLayer)           [(None, None, None,  0           []                               
                                 3)]                                                              
                                                                                                  
 rescaling (Rescaling)          (None, None, None,   0           ['input_1[0][0]']                
                                3)                                                                
                                                                                                  
 conv2d (Conv2D)                (None, None, None,   1792        ['rescaling[0][0]']              
                                64)                                                               
                                                                                                  
 conv2d_1 (Conv2D)              (None, None, None,   36928       ['conv2d[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_2 (Conv2D)              (None, None, None,   36928       ['conv2d_1[0][0]']               
                                64)                                                               
                                                                                                  
 add (Add)                      (None, None, None,   0           ['conv2d[0][0]',                 
                                64)                               'conv2d_2[0][0]']               
                                                                                                  
 conv2d_3 (Conv2D)              (None, None, None,   36928       ['add[0][0]']                    
                                64)                                                               
                                                                                                  
 conv2d_4 (Conv2D)              (None, None, None,   36928       ['conv2d_3[0][0]']               
                                64)                                                               
                                                                                                  
 add_1 (Add)                    (None, None, None,   0           ['add[0][0]',                    
                                64)                               'conv2d_4[0][0]']               
                                                                                                  
 conv2d_5 (Conv2D)              (None, None, None,   36928       ['add_1[0][0]']                  
                                64)                                                               
                                                                                                  
 add_2 (Add)                    (None, None, None,   0           ['conv2d[0][0]',                 
                                64)                               'conv2d_5[0][0]']               
                                                                                                  
 conv2d_transpose (Conv2DTransp  (None, None, None,   36928      ['add_2[0][0]']                  
 ose)                           64)                                                               
                                                                                                  
 conv2d_transpose_1 (Conv2DTran  (None, None, None,   36928      ['conv2d_transpose[0][0]']       
 spose)                         64)                                                               
                                                                                                  
 conv2d_6 (Conv2D)              (None, None, None,   1731        ['conv2d_transpose_1[0][0]']     
                                3)                                                                
                                                                                                  
 rescaling_1 (Rescaling)        (None, None, None,   0           ['conv2d_6[0][0]']               
                                3)                                                                
                                                                                                  
==================================================================================================
Total params: 262,019
Trainable params: 262,019
Non-trainable params: 0
__________________________________________________________________________________________________
Epoch 1/100
200/200 [==============================] - 93s 386ms/step - loss: 26.0077 - PSNR: 19.3904 - val_loss: 13.4416 - val_PSNR: 23.4018
Epoch 2/100
200/200 [==============================] - 3s 15ms/step - loss: 11.8940 - PSNR: 24.7529 - val_loss: 10.5736 - val_PSNR: 25.8571
Epoch 3/100
200/200 [==============================] - 3s 15ms/step - loss: 10.3121 - PSNR: 26.7114 - val_loss: 9.1780 - val_PSNR: 27.7544
Epoch 4/100
200/200 [==============================] - 3s 15ms/step - loss: 9.9028 - PSNR: 28.1525 - val_loss: 10.7088 - val_PSNR: 27.2197
Epoch 5/100
200/200 [==============================] - 3s 16ms/step - loss: 9.1496 - PSNR: 27.8388 - val_loss: 9.0287 - val_PSNR: 27.9476
Epoch 6/100
200/200 [==============================] - 3s 17ms/step - loss: 9.1355 - PSNR: 28.6734 - val_loss: 9.0062 - val_PSNR: 29.3184
Epoch 7/100
200/200 [==============================] - 3s 15ms/step - loss: 8.7563 - PSNR: 28.7079 - val_loss: 8.1244 - val_PSNR: 29.5043
Epoch 8/100
200/200 [==============================] - 3s 16ms/step - loss: 8.5511 - PSNR: 28.9785 - val_loss: 7.9841 - val_PSNR: 28.4299
Epoch 9/100
200/200 [==============================] - 3s 15ms/step - loss: 8.3824 - PSNR: 28.6353 - val_loss: 8.4926 - val_PSNR: 28.2222
Epoch 10/100
200/200 [==============================] - 3s 15ms/step - loss: 8.2338 - PSNR: 29.0945 - val_loss: 8.5007 - val_PSNR: 30.7831
Epoch 11/100
200/200 [==============================] - 3s 15ms/step - loss: 8.2211 - PSNR: 29.7683 - val_loss: 8.1780 - val_PSNR: 30.4173
Epoch 12/100
200/200 [==============================] - 3s 16ms/step - loss: 8.0517 - PSNR: 30.7399 - val_loss: 8.6715 - val_PSNR: 27.9523
Epoch 13/100
200/200 [==============================] - 3s 15ms/step - loss: 8.1239 - PSNR: 30.1587 - val_loss: 9.0686 - val_PSNR: 32.0912
Epoch 14/100
200/200 [==============================] - 3s 15ms/step - loss: 8.0700 - PSNR: 30.4254 - val_loss: 7.3775 - val_PSNR: 26.5481
Epoch 15/100
200/200 [==============================] - 3s 15ms/step - loss: 8.0031 - PSNR: 30.4698 - val_loss: 8.1247 - val_PSNR: 27.4976
Epoch 16/100
200/200 [==============================] - 4s 18ms/step - loss: 8.0308 - PSNR: 30.1201 - val_loss: 7.2475 - val_PSNR: 27.3385
Epoch 17/100
200/200 [==============================] - 3s 15ms/step - loss: 8.0368 - PSNR: 30.4586 - val_loss: 7.7127 - val_PSNR: 32.2875
Epoch 18/100
200/200 [==============================] - 3s 15ms/step - loss: 7.7633 - PSNR: 30.0297 - val_loss: 7.7721 - val_PSNR: 31.3958
Epoch 19/100
200/200 [==============================] - 3s 16ms/step - loss: 7.9631 - PSNR: 29.7005 - val_loss: 7.3407 - val_PSNR: 27.9643
Epoch 20/100
200/200 [==============================] - 3s 16ms/step - loss: 7.9111 - PSNR: 30.4711 - val_loss: 7.1096 - val_PSNR: 31.0870
Epoch 21/100
200/200 [==============================] - 3s 15ms/step - loss: 7.7534 - PSNR: 31.6228 - val_loss: 7.9298 - val_PSNR: 28.5912
Epoch 22/100
200/200 [==============================] - 3s 16ms/step - loss: 7.7802 - PSNR: 29.6846 - val_loss: 6.9319 - val_PSNR: 29.7697
Epoch 23/100
200/200 [==============================] - 3s 15ms/step - loss: 7.6128 - PSNR: 30.9954 - val_loss: 8.2546 - val_PSNR: 29.8548
Epoch 24/100
200/200 [==============================] - 3s 15ms/step - loss: 7.6731 - PSNR: 31.0549 - val_loss: 7.9262 - val_PSNR: 31.9932
Epoch 25/100
200/200 [==============================] - 3s 15ms/step - loss: 7.5875 - PSNR: 30.1604 - val_loss: 7.9776 - val_PSNR: 30.2620
Epoch 26/100
200/200 [==============================] - 4s 18ms/step - loss: 7.4255 - PSNR: 31.3593 - val_loss: 7.8002 - val_PSNR: 27.8883
Epoch 27/100
200/200 [==============================] - 3s 16ms/step - loss: 7.6232 - PSNR: 31.1332 - val_loss: 8.5745 - val_PSNR: 27.8353
Epoch 28/100
200/200 [==============================] - 3s 16ms/step - loss: 7.5832 - PSNR: 31.2332 - val_loss: 7.6236 - val_PSNR: 32.5498
Epoch 29/100
200/200 [==============================] - 3s 16ms/step - loss: 7.5462 - PSNR: 30.9910 - val_loss: 6.5103 - val_PSNR: 31.4960
Epoch 30/100
200/200 [==============================] - 3s 15ms/step - loss: 7.5361 - PSNR: 30.6922 - val_loss: 8.7082 - val_PSNR: 35.9414
Epoch 31/100
200/200 [==============================] - 3s 15ms/step - loss: 7.6133 - PSNR: 30.4844 - val_loss: 6.8355 - val_PSNR: 28.7111
Epoch 32/100
200/200 [==============================] - 3s 15ms/step - loss: 7.4205 - PSNR: 31.4966 - val_loss: 6.3846 - val_PSNR: 34.6890
Epoch 33/100
200/200 [==============================] - 3s 16ms/step - loss: 7.4109 - PSNR: 31.6791 - val_loss: 7.2089 - val_PSNR: 30.6551
Epoch 34/100
200/200 [==============================] - 3s 15ms/step - loss: 7.6131 - PSNR: 31.1010 - val_loss: 7.3136 - val_PSNR: 33.6192
Epoch 35/100
200/200 [==============================] - 3s 16ms/step - loss: 7.4030 - PSNR: 31.9738 - val_loss: 7.3695 - val_PSNR: 30.9103
Epoch 36/100
200/200 [==============================] - 4s 18ms/step - loss: 7.5771 - PSNR: 31.0970 - val_loss: 7.8494 - val_PSNR: 32.1128
Epoch 37/100
200/200 [==============================] - 3s 15ms/step - loss: 7.3825 - PSNR: 30.5622 - val_loss: 6.8422 - val_PSNR: 28.6725
Epoch 38/100
200/200 [==============================] - 3s 15ms/step - loss: 7.5327 - PSNR: 32.1507 - val_loss: 7.2841 - val_PSNR: 32.4546
Epoch 39/100
200/200 [==============================] - 3s 15ms/step - loss: 7.4388 - PSNR: 32.1707 - val_loss: 7.5735 - val_PSNR: 29.3427
Epoch 40/100
200/200 [==============================] - 3s 16ms/step - loss: 7.2662 - PSNR: 30.4588 - val_loss: 7.4548 - val_PSNR: 34.4893
Epoch 41/100
200/200 [==============================] - 3s 16ms/step - loss: 7.4535 - PSNR: 31.1783 - val_loss: 7.4570 - val_PSNR: 28.8110
Epoch 42/100
200/200 [==============================] - 3s 15ms/step - loss: 7.3887 - PSNR: 31.6778 - val_loss: 7.1370 - val_PSNR: 29.4220
Epoch 43/100
200/200 [==============================] - 3s 16ms/step - loss: 7.5270 - PSNR: 30.9984 - val_loss: 7.7197 - val_PSNR: 34.1324
Epoch 44/100
200/200 [==============================] - 3s 15ms/step - loss: 7.5064 - PSNR: 31.7934 - val_loss: 7.1305 - val_PSNR: 27.0228
Epoch 45/100
200/200 [==============================] - 3s 15ms/step - loss: 7.4614 - PSNR: 32.4603 - val_loss: 6.9087 - val_PSNR: 32.2898
Epoch 46/100
200/200 [==============================] - 3s 17ms/step - loss: 7.4776 - PSNR: 30.9727 - val_loss: 7.3854 - val_PSNR: 35.0505
Epoch 47/100
200/200 [==============================] - 3s 17ms/step - loss: 7.3190 - PSNR: 31.6789 - val_loss: 7.5634 - val_PSNR: 29.8970
Epoch 48/100
200/200 [==============================] - 3s 16ms/step - loss: 7.3433 - PSNR: 31.6313 - val_loss: 6.8460 - val_PSNR: 28.8808
Epoch 49/100
200/200 [==============================] - 3s 16ms/step - loss: 7.4155 - PSNR: 32.6432 - val_loss: 7.0619 - val_PSNR: 31.4479
Epoch 50/100
200/200 [==============================] - 3s 16ms/step - loss: 7.2926 - PSNR: 32.0654 - val_loss: 8.0797 - val_PSNR: 31.4299
Epoch 51/100
200/200 [==============================] - 3s 15ms/step - loss: 7.4200 - PSNR: 31.8993 - val_loss: 7.8445 - val_PSNR: 31.1228
Epoch 52/100
200/200 [==============================] - 3s 15ms/step - loss: 7.2554 - PSNR: 31.1811 - val_loss: 7.6315 - val_PSNR: 30.1349
Epoch 53/100
200/200 [==============================] - 3s 15ms/step - loss: 7.4110 - PSNR: 31.4295 - val_loss: 7.2569 - val_PSNR: 32.3992
Epoch 54/100
200/200 [==============================] - 3s 16ms/step - loss: 7.2093 - PSNR: 32.1411 - val_loss: 6.1811 - val_PSNR: 31.0903
Epoch 55/100
200/200 [==============================] - 3s 15ms/step - loss: 7.2782 - PSNR: 31.2167 - val_loss: 6.7976 - val_PSNR: 32.9117
Epoch 56/100
200/200 [==============================] - 3s 17ms/step - loss: 7.3198 - PSNR: 31.5309 - val_loss: 6.3761 - val_PSNR: 28.8345
Epoch 57/100
200/200 [==============================] - 3s 16ms/step - loss: 7.3257 - PSNR: 31.9350 - val_loss: 7.0561 - val_PSNR: 31.5651
Epoch 58/100
200/200 [==============================] - 3s 15ms/step - loss: 7.2063 - PSNR: 32.6960 - val_loss: 7.9219 - val_PSNR: 27.1854
Epoch 59/100
200/200 [==============================] - 3s 16ms/step - loss: 7.3587 - PSNR: 32.5090 - val_loss: 7.5345 - val_PSNR: 27.8207
Epoch 60/100
200/200 [==============================] - 3s 15ms/step - loss: 7.3397 - PSNR: 31.3378 - val_loss: 7.4077 - val_PSNR: 33.9306
Epoch 61/100
200/200 [==============================] - 3s 16ms/step - loss: 7.3980 - PSNR: 31.3554 - val_loss: 7.5543 - val_PSNR: 30.5030
Epoch 62/100
200/200 [==============================] - 3s 15ms/step - loss: 7.0556 - PSNR: 32.5478 - val_loss: 7.0932 - val_PSNR: 30.0212
Epoch 63/100
200/200 [==============================] - 3s 15ms/step - loss: 7.3135 - PSNR: 31.5336 - val_loss: 7.0836 - val_PSNR: 33.8371
Epoch 64/100
200/200 [==============================] - 3s 16ms/step - loss: 7.3749 - PSNR: 32.0651 - val_loss: 6.4197 - val_PSNR: 32.1102
Epoch 65/100
200/200 [==============================] - 3s 15ms/step - loss: 7.3347 - PSNR: 31.3794 - val_loss: 8.2106 - val_PSNR: 30.4552
Epoch 66/100
200/200 [==============================] - 3s 17ms/step - loss: 7.1942 - PSNR: 30.7372 - val_loss: 7.1274 - val_PSNR: 29.9191
Epoch 67/100
200/200 [==============================] - 3s 15ms/step - loss: 7.1405 - PSNR: 32.1929 - val_loss: 7.6208 - val_PSNR: 33.9641
Epoch 68/100
200/200 [==============================] - 3s 16ms/step - loss: 7.1843 - PSNR: 31.4865 - val_loss: 6.2029 - val_PSNR: 33.4571
Epoch 69/100
200/200 [==============================] - 3s 15ms/step - loss: 7.2105 - PSNR: 32.2632 - val_loss: 7.5698 - val_PSNR: 29.9897
Epoch 70/100
200/200 [==============================] - 3s 15ms/step - loss: 7.3114 - PSNR: 31.7595 - val_loss: 7.7389 - val_PSNR: 30.6360
Epoch 71/100
200/200 [==============================] - 3s 16ms/step - loss: 7.3465 - PSNR: 31.9711 - val_loss: 7.8861 - val_PSNR: 28.1419
Epoch 72/100
200/200 [==============================] - 3s 15ms/step - loss: 7.2778 - PSNR: 31.9102 - val_loss: 7.2720 - val_PSNR: 33.0383
Epoch 73/100
200/200 [==============================] - 3s 15ms/step - loss: 7.3258 - PSNR: 31.7433 - val_loss: 6.2241 - val_PSNR: 30.7463
Epoch 74/100
200/200 [==============================] - 3s 15ms/step - loss: 7.3134 - PSNR: 32.8456 - val_loss: 7.3215 - val_PSNR: 32.6406
Epoch 75/100
200/200 [==============================] - 3s 16ms/step - loss: 7.1575 - PSNR: 31.9620 - val_loss: 7.4709 - val_PSNR: 29.0692
Epoch 76/100
200/200 [==============================] - 3s 17ms/step - loss: 7.1337 - PSNR: 32.3675 - val_loss: 8.2858 - val_PSNR: 27.5748
Epoch 77/100
200/200 [==============================] - 3s 15ms/step - loss: 7.1593 - PSNR: 32.2862 - val_loss: 6.8388 - val_PSNR: 30.6326
Epoch 78/100
200/200 [==============================] - 3s 16ms/step - loss: 7.1791 - PSNR: 32.5867 - val_loss: 7.4485 - val_PSNR: 32.4032
Epoch 79/100
200/200 [==============================] - 3s 16ms/step - loss: 7.2565 - PSNR: 32.4317 - val_loss: 7.9052 - val_PSNR: 32.0296
Epoch 80/100
200/200 [==============================] - 3s 15ms/step - loss: 7.1561 - PSNR: 31.5694 - val_loss: 6.6874 - val_PSNR: 27.7263
Epoch 81/100
200/200 [==============================] - 3s 15ms/step - loss: 7.4095 - PSNR: 32.1322 - val_loss: 7.4726 - val_PSNR: 36.0839
Epoch 82/100
200/200 [==============================] - 3s 16ms/step - loss: 7.2979 - PSNR: 30.6270 - val_loss: 7.3171 - val_PSNR: 31.3009
Epoch 83/100
200/200 [==============================] - 3s 15ms/step - loss: 7.0970 - PSNR: 32.1242 - val_loss: 7.7989 - val_PSNR: 28.6302
Epoch 84/100
200/200 [==============================] - 3s 15ms/step - loss: 7.0221 - PSNR: 31.8094 - val_loss: 7.3101 - val_PSNR: 33.4611
Epoch 85/100
200/200 [==============================] - 3s 15ms/step - loss: 7.2803 - PSNR: 32.1810 - val_loss: 7.0450 - val_PSNR: 31.1707
Epoch 86/100
200/200 [==============================] - 4s 18ms/step - loss: 7.2628 - PSNR: 32.5114 - val_loss: 7.2823 - val_PSNR: 31.3794
Epoch 87/100
200/200 [==============================] - 3s 15ms/step - loss: 7.2140 - PSNR: 32.3303 - val_loss: 6.8339 - val_PSNR: 32.3519
Epoch 88/100
200/200 [==============================] - 3s 15ms/step - loss: 7.2095 - PSNR: 32.1476 - val_loss: 7.1493 - val_PSNR: 32.0670
Epoch 89/100
200/200 [==============================] - 3s 16ms/step - loss: 7.1159 - PSNR: 32.3952 - val_loss: 6.3641 - val_PSNR: 32.6039
Epoch 90/100
200/200 [==============================] - 3s 15ms/step - loss: 7.1539 - PSNR: 32.1592 - val_loss: 7.7503 - val_PSNR: 31.6548
Epoch 91/100
200/200 [==============================] - 3s 15ms/step - loss: 7.2255 - PSNR: 31.9065 - val_loss: 7.5329 - val_PSNR: 29.0174
Epoch 92/100
200/200 [==============================] - 3s 15ms/step - loss: 7.1002 - PSNR: 32.1207 - val_loss: 5.9963 - val_PSNR: 35.8053
Epoch 93/100
200/200 [==============================] - 3s 16ms/step - loss: 7.1534 - PSNR: 32.2909 - val_loss: 7.5460 - val_PSNR: 29.2607
Epoch 94/100
200/200 [==============================] - 3s 15ms/step - loss: 7.3520 - PSNR: 32.2647 - val_loss: 7.0466 - val_PSNR: 33.7377
Epoch 95/100
200/200 [==============================] - 3s 15ms/step - loss: 7.1643 - PSNR: 32.4732 - val_loss: 6.0990 - val_PSNR: 30.7888
Epoch 96/100
200/200 [==============================] - 4s 18ms/step - loss: 7.1104 - PSNR: 32.5177 - val_loss: 7.2092 - val_PSNR: 29.1882
Epoch 97/100
200/200 [==============================] - 3s 15ms/step - loss: 7.1933 - PSNR: 32.0795 - val_loss: 7.4523 - val_PSNR: 31.0217
Epoch 98/100
200/200 [==============================] - 3s 15ms/step - loss: 7.2350 - PSNR: 32.3973 - val_loss: 7.8097 - val_PSNR: 25.7138
Epoch 99/100
200/200 [==============================] - 3s 15ms/step - loss: 7.1280 - PSNR: 31.2235 - val_loss: 6.4538 - val_PSNR: 35.5771
Epoch 100/100
200/200 [==============================] - 3s 16ms/step - loss: 7.0389 - PSNR: 32.6359 - val_loss: 6.6791 - val_PSNR: 29.5880
Training time: 403.41 seconds
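The PSNR metric reported in every epoch above measures reconstruction quality in decibels. As a reference, here is a minimal NumPy sketch of the standard PSNR formula for 8-bit images (an illustration of the definition, not the notebook's TensorFlow metric, which computes the same quantity via `tf.image.psnr`):

```python
import numpy as np

def psnr(hr, sr, max_val=255.0):
    """Peak signal-to-noise ratio in dB: 10 * log10(MAX^2 / MSE)."""
    mse = np.mean((hr.astype(np.float64) - sr.astype(np.float64)) ** 2)
    return 10.0 * np.log10(max_val ** 2 / mse)

hr = np.zeros((4, 4))
sr = hr + 1.0          # every pixel off by exactly 1 -> MSE = 1
print(round(psnr(hr, sr), 2))  # 48.13 dB
```

Higher is better: a constant per-pixel error of 1 on an 8-bit scale already yields about 48 dB, so the ~30 dB values in the logs correspond to noticeably larger average errors.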
Train with Residual Blocks: 2 and Upsample method: UpSampling2D
Model: "edsr_model_1"
__________________________________________________________________________________________________
 Layer (type)                   Output Shape         Param #     Connected to                     
==================================================================================================
 input_2 (InputLayer)           [(None, None, None,  0           []                               
                                 3)]                                                              
                                                                                                  
 rescaling_2 (Rescaling)        (None, None, None,   0           ['input_2[0][0]']                
                                3)                                                                
                                                                                                  
 conv2d_7 (Conv2D)              (None, None, None,   1792        ['rescaling_2[0][0]']            
                                64)                                                               
                                                                                                  
 conv2d_8 (Conv2D)              (None, None, None,   36928       ['conv2d_7[0][0]']               
                                64)                                                               
                                                                                                  
 conv2d_9 (Conv2D)              (None, None, None,   36928       ['conv2d_8[0][0]']               
                                64)                                                               
                                                                                                  
 add_3 (Add)                    (None, None, None,   0           ['conv2d_7[0][0]',               
                                64)                               'conv2d_9[0][0]']               
                                                                                                  
 conv2d_10 (Conv2D)             (None, None, None,   36928       ['add_3[0][0]']                  
                                64)                                                               
                                                                                                  
 conv2d_11 (Conv2D)             (None, None, None,   36928       ['conv2d_10[0][0]']              
                                64)                                                               
                                                                                                  
 add_4 (Add)                    (None, None, None,   0           ['add_3[0][0]',                  
                                64)                               'conv2d_11[0][0]']              
                                                                                                  
 conv2d_12 (Conv2D)             (None, None, None,   36928       ['add_4[0][0]']                  
                                64)                                                               
                                                                                                  
 add_5 (Add)                    (None, None, None,   0           ['conv2d_7[0][0]',               
                                64)                               'conv2d_12[0][0]']              
                                                                                                  
 up_sampling2d (UpSampling2D)   (None, None, None,   0           ['add_5[0][0]']                  
                                64)                                                               
                                                                                                  
 up_sampling2d_1 (UpSampling2D)  (None, None, None,   0          ['up_sampling2d[0][0]']          
                                64)                                                               
                                                                                                  
 conv2d_13 (Conv2D)             (None, None, None,   1731        ['up_sampling2d_1[0][0]']        
                                3)                                                                
                                                                                                  
 rescaling_3 (Rescaling)        (None, None, None,   0           ['conv2d_13[0][0]']              
                                3)                                                                
                                                                                                  
==================================================================================================
Total params: 188,163
Trainable params: 188,163
Non-trainable params: 0
__________________________________________________________________________________________________
Epoch 1/100
200/200 [==============================] - 5s 12ms/step - loss: 17.5611 - PSNR: 22.2566 - val_loss: 11.3832 - val_PSNR: 23.8369
Epoch 2/100
200/200 [==============================] - 2s 11ms/step - loss: 10.7237 - PSNR: 25.5171 - val_loss: 10.4641 - val_PSNR: 24.2857
Epoch 3/100
200/200 [==============================] - 2s 11ms/step - loss: 10.0345 - PSNR: 26.8515 - val_loss: 9.5696 - val_PSNR: 28.8364
Epoch 4/100
200/200 [==============================] - 3s 13ms/step - loss: 9.2088 - PSNR: 28.0879 - val_loss: 9.1982 - val_PSNR: 28.3943
Epoch 5/100
200/200 [==============================] - 2s 12ms/step - loss: 9.0015 - PSNR: 28.5345 - val_loss: 7.8085 - val_PSNR: 30.8780
Epoch 6/100
200/200 [==============================] - 2s 11ms/step - loss: 8.9406 - PSNR: 28.3081 - val_loss: 9.0822 - val_PSNR: 26.6822
Epoch 7/100
200/200 [==============================] - 2s 11ms/step - loss: 8.5874 - PSNR: 28.8237 - val_loss: 8.8984 - val_PSNR: 25.7619
Epoch 8/100
200/200 [==============================] - 2s 11ms/step - loss: 8.4927 - PSNR: 29.2311 - val_loss: 8.5469 - val_PSNR: 29.2415
Epoch 9/100
200/200 [==============================] - 2s 11ms/step - loss: 8.7775 - PSNR: 28.9736 - val_loss: 8.4969 - val_PSNR: 29.7744
Epoch 10/100
200/200 [==============================] - 2s 12ms/step - loss: 8.1410 - PSNR: 29.4230 - val_loss: 7.9586 - val_PSNR: 29.9160
Epoch 11/100
200/200 [==============================] - 2s 11ms/step - loss: 8.2344 - PSNR: 29.5408 - val_loss: 8.9374 - val_PSNR: 30.9008
Epoch 12/100
200/200 [==============================] - 2s 11ms/step - loss: 8.4432 - PSNR: 28.6260 - val_loss: 7.9583 - val_PSNR: 30.1378
Epoch 13/100
200/200 [==============================] - 2s 11ms/step - loss: 8.1569 - PSNR: 29.8930 - val_loss: 7.6480 - val_PSNR: 29.6612
Epoch 14/100
200/200 [==============================] - 2s 11ms/step - loss: 8.1597 - PSNR: 29.9833 - val_loss: 8.7775 - val_PSNR: 25.4647
Epoch 15/100
200/200 [==============================] - 2s 12ms/step - loss: 8.3390 - PSNR: 30.2917 - val_loss: 7.8353 - val_PSNR: 31.3252
Epoch 16/100
200/200 [==============================] - 2s 11ms/step - loss: 8.1227 - PSNR: 30.9358 - val_loss: 7.9261 - val_PSNR: 25.0423
Epoch 17/100
200/200 [==============================] - 2s 11ms/step - loss: 8.1535 - PSNR: 29.8010 - val_loss: 8.3149 - val_PSNR: 25.1602
Epoch 18/100
200/200 [==============================] - 3s 13ms/step - loss: 8.0193 - PSNR: 29.4051 - val_loss: 7.5946 - val_PSNR: 28.2554
Epoch 19/100
200/200 [==============================] - 2s 11ms/step - loss: 7.8342 - PSNR: 30.3961 - val_loss: 8.1935 - val_PSNR: 25.6649
Epoch 20/100
200/200 [==============================] - 2s 12ms/step - loss: 7.9009 - PSNR: 30.4967 - val_loss: 7.0708 - val_PSNR: 29.6138
Epoch 21/100
200/200 [==============================] - 2s 11ms/step - loss: 8.0169 - PSNR: 29.8382 - val_loss: 8.0082 - val_PSNR: 29.6785
Epoch 22/100
200/200 [==============================] - 2s 11ms/step - loss: 7.9334 - PSNR: 29.8923 - val_loss: 7.2025 - val_PSNR: 31.4152
Epoch 23/100
200/200 [==============================] - 2s 11ms/step - loss: 8.0666 - PSNR: 30.5321 - val_loss: 8.5540 - val_PSNR: 30.9307
Epoch 24/100
200/200 [==============================] - 2s 11ms/step - loss: 7.8504 - PSNR: 30.2995 - val_loss: 7.1991 - val_PSNR: 32.0603
Epoch 25/100
200/200 [==============================] - 2s 11ms/step - loss: 8.0768 - PSNR: 30.9172 - val_loss: 8.9309 - val_PSNR: 25.9212
Epoch 26/100
200/200 [==============================] - 2s 11ms/step - loss: 7.6855 - PSNR: 31.0251 - val_loss: 8.3858 - val_PSNR: 30.7971
Epoch 27/100
200/200 [==============================] - 2s 11ms/step - loss: 7.8098 - PSNR: 30.3536 - val_loss: 8.6183 - val_PSNR: 29.0548
Epoch 28/100
200/200 [==============================] - 2s 10ms/step - loss: 7.5904 - PSNR: 30.7013 - val_loss: 7.8551 - val_PSNR: 26.3744
Epoch 29/100
200/200 [==============================] - 2s 11ms/step - loss: 7.7751 - PSNR: 30.5357 - val_loss: 7.6300 - val_PSNR: 29.4049
Epoch 30/100
200/200 [==============================] - 2s 12ms/step - loss: 7.7399 - PSNR: 30.0042 - val_loss: 8.2291 - val_PSNR: 31.6439
Epoch 31/100
200/200 [==============================] - 2s 11ms/step - loss: 7.6599 - PSNR: 30.2525 - val_loss: 7.6719 - val_PSNR: 29.9416
Epoch 32/100
200/200 [==============================] - 3s 13ms/step - loss: 7.7942 - PSNR: 31.3151 - val_loss: 7.7812 - val_PSNR: 29.3711
Epoch 33/100
200/200 [==============================] - 2s 11ms/step - loss: 7.7049 - PSNR: 30.9419 - val_loss: 7.3386 - val_PSNR: 30.8990
Epoch 34/100
200/200 [==============================] - 2s 12ms/step - loss: 7.6630 - PSNR: 31.0804 - val_loss: 8.5485 - val_PSNR: 33.0256
Epoch 35/100
200/200 [==============================] - 2s 12ms/step - loss: 7.8399 - PSNR: 30.4895 - val_loss: 8.2928 - val_PSNR: 30.1722
Epoch 36/100
200/200 [==============================] - 2s 11ms/step - loss: 7.7882 - PSNR: 30.9927 - val_loss: 8.1888 - val_PSNR: 30.1871
Epoch 37/100
200/200 [==============================] - 2s 11ms/step - loss: 7.7558 - PSNR: 30.8189 - val_loss: 7.5333 - val_PSNR: 35.2163
Epoch 38/100
200/200 [==============================] - 2s 11ms/step - loss: 7.7091 - PSNR: 32.2192 - val_loss: 8.0536 - val_PSNR: 29.2348
Epoch 39/100
200/200 [==============================] - 2s 12ms/step - loss: 7.6574 - PSNR: 30.4475 - val_loss: 8.0398 - val_PSNR: 29.5164
Epoch 40/100
200/200 [==============================] - 2s 11ms/step - loss: 7.6915 - PSNR: 31.2020 - val_loss: 7.4726 - val_PSNR: 31.1509
Epoch 41/100
200/200 [==============================] - 2s 11ms/step - loss: 7.8268 - PSNR: 30.9431 - val_loss: 7.0765 - val_PSNR: 29.6852
Epoch 42/100
200/200 [==============================] - 2s 11ms/step - loss: 7.6525 - PSNR: 30.5289 - val_loss: 7.9162 - val_PSNR: 31.7759
Epoch 43/100
200/200 [==============================] - 2s 11ms/step - loss: 7.6011 - PSNR: 30.9461 - val_loss: 7.5999 - val_PSNR: 34.5599
Epoch 44/100
200/200 [==============================] - 2s 12ms/step - loss: 7.5630 - PSNR: 30.3174 - val_loss: 8.0693 - val_PSNR: 28.7737
Epoch 45/100
200/200 [==============================] - 2s 11ms/step - loss: 7.7182 - PSNR: 31.0691 - val_loss: 7.8853 - val_PSNR: 30.8070
Epoch 46/100
200/200 [==============================] - 3s 13ms/step - loss: 7.7203 - PSNR: 31.6575 - val_loss: 7.7245 - val_PSNR: 30.0153
Epoch 47/100
200/200 [==============================] - 2s 11ms/step - loss: 7.4670 - PSNR: 30.8821 - val_loss: 8.4535 - val_PSNR: 31.8248
Epoch 48/100
200/200 [==============================] - 2s 11ms/step - loss: 7.5366 - PSNR: 30.9305 - val_loss: 7.2159 - val_PSNR: 30.8327
Epoch 49/100
200/200 [==============================] - 3s 13ms/step - loss: 7.5408 - PSNR: 31.3042 - val_loss: 8.4665 - val_PSNR: 28.4591
Epoch 50/100
200/200 [==============================] - 2s 11ms/step - loss: 7.6089 - PSNR: 31.6965 - val_loss: 7.2770 - val_PSNR: 29.3248
Epoch 51/100
200/200 [==============================] - 2s 11ms/step - loss: 7.5446 - PSNR: 31.6413 - val_loss: 7.0572 - val_PSNR: 30.1849
Epoch 52/100
200/200 [==============================] - 2s 11ms/step - loss: 7.6609 - PSNR: 31.8325 - val_loss: 7.3926 - val_PSNR: 34.5944
Epoch 53/100
200/200 [==============================] - 2s 11ms/step - loss: 7.5942 - PSNR: 30.4697 - val_loss: 7.8329 - val_PSNR: 28.4736
Epoch 54/100
200/200 [==============================] - 2s 12ms/step - loss: 7.4622 - PSNR: 31.8439 - val_loss: 6.8339 - val_PSNR: 31.6967
Epoch 55/100
200/200 [==============================] - 2s 11ms/step - loss: 7.5471 - PSNR: 31.6680 - val_loss: 6.9598 - val_PSNR: 34.0424
Epoch 56/100
200/200 [==============================] - 2s 11ms/step - loss: 7.7097 - PSNR: 31.7279 - val_loss: 7.5109 - val_PSNR: 33.0944
Epoch 57/100
200/200 [==============================] - 2s 11ms/step - loss: 7.6459 - PSNR: 31.2620 - val_loss: 7.6807 - val_PSNR: 36.2936
Epoch 58/100
200/200 [==============================] - 2s 11ms/step - loss: 7.7026 - PSNR: 31.3578 - val_loss: 7.1780 - val_PSNR: 29.9805
Epoch 59/100
200/200 [==============================] - 2s 12ms/step - loss: 7.5476 - PSNR: 32.1722 - val_loss: 7.8625 - val_PSNR: 32.3896
Epoch 60/100
200/200 [==============================] - 3s 13ms/step - loss: 7.5743 - PSNR: 30.8480 - val_loss: 7.5567 - val_PSNR: 32.2495
Epoch 61/100
200/200 [==============================] - 2s 11ms/step - loss: 7.6876 - PSNR: 30.6280 - val_loss: 7.9887 - val_PSNR: 30.5825
Epoch 62/100
200/200 [==============================] - 2s 11ms/step - loss: 7.6747 - PSNR: 30.9946 - val_loss: 7.4555 - val_PSNR: 33.1230
Epoch 63/100
200/200 [==============================] - 2s 12ms/step - loss: 7.7036 - PSNR: 30.7404 - val_loss: 8.2712 - val_PSNR: 33.4524
Epoch 64/100
200/200 [==============================] - 2s 12ms/step - loss: 7.5320 - PSNR: 30.5849 - val_loss: 7.4396 - val_PSNR: 27.9853
Epoch 65/100
200/200 [==============================] - 2s 11ms/step - loss: 7.5225 - PSNR: 32.0149 - val_loss: 7.3528 - val_PSNR: 33.0964
Epoch 66/100
200/200 [==============================] - 2s 11ms/step - loss: 7.6687 - PSNR: 30.4002 - val_loss: 7.7787 - val_PSNR: 29.2257
Epoch 67/100
200/200 [==============================] - 2s 11ms/step - loss: 7.5971 - PSNR: 30.5941 - val_loss: 7.3687 - val_PSNR: 28.6404
Epoch 68/100
200/200 [==============================] - 2s 12ms/step - loss: 7.6689 - PSNR: 31.2422 - val_loss: 8.4635 - val_PSNR: 26.8450
Epoch 69/100
200/200 [==============================] - 2s 11ms/step - loss: 7.4691 - PSNR: 31.6019 - val_loss: 7.3259 - val_PSNR: 31.0962
Epoch 70/100
200/200 [==============================] - 2s 11ms/step - loss: 7.5107 - PSNR: 31.0272 - val_loss: 8.2699 - val_PSNR: 27.9997
Epoch 71/100
200/200 [==============================] - 2s 11ms/step - loss: 7.3728 - PSNR: 31.0277 - val_loss: 8.5988 - val_PSNR: 32.3047
Epoch 72/100
200/200 [==============================] - 2s 11ms/step - loss: 7.5932 - PSNR: 30.8775 - val_loss: 7.0831 - val_PSNR: 27.1569
Epoch 73/100
200/200 [==============================] - 2s 11ms/step - loss: 7.4930 - PSNR: 31.1903 - val_loss: 8.1660 - val_PSNR: 34.4056
Epoch 74/100
200/200 [==============================] - 3s 14ms/step - loss: 7.5663 - PSNR: 31.2638 - val_loss: 6.5741 - val_PSNR: 33.0446
Epoch 75/100
200/200 [==============================] - 2s 11ms/step - loss: 7.6192 - PSNR: 31.4063 - val_loss: 7.3789 - val_PSNR: 30.8262
Epoch 76/100
200/200 [==============================] - 2s 11ms/step - loss: 7.4618 - PSNR: 30.7345 - val_loss: 7.7127 - val_PSNR: 31.2482
Epoch 77/100
200/200 [==============================] - 2s 11ms/step - loss: 7.6059 - PSNR: 31.3496 - val_loss: 7.5903 - val_PSNR: 31.8960
Epoch 78/100
200/200 [==============================] - 2s 12ms/step - loss: 7.5511 - PSNR: 31.0718 - val_loss: 7.3490 - val_PSNR: 30.4218
Epoch 79/100
200/200 [==============================] - 2s 11ms/step - loss: 7.6665 - PSNR: 30.2928 - val_loss: 8.6211 - val_PSNR: 29.7174
Epoch 80/100
200/200 [==============================] - 2s 11ms/step - loss: 7.4879 - PSNR: 32.1696 - val_loss: 7.3044 - val_PSNR: 32.4277
Epoch 81/100
200/200 [==============================] - 2s 11ms/step - loss: 7.6148 - PSNR: 31.6257 - val_loss: 6.9130 - val_PSNR: 30.6814
Epoch 82/100
200/200 [==============================] - 2s 11ms/step - loss: 7.6288 - PSNR: 32.1312 - val_loss: 7.7965 - val_PSNR: 32.0039
Epoch 83/100
200/200 [==============================] - 2s 12ms/step - loss: 7.5904 - PSNR: 30.8474 - val_loss: 7.4848 - val_PSNR: 28.7674
Epoch 84/100
200/200 [==============================] - 2s 11ms/step - loss: 7.5996 - PSNR: 32.0202 - val_loss: 7.2823 - val_PSNR: 32.5592
Epoch 85/100
200/200 [==============================] - 2s 11ms/step - loss: 7.5810 - PSNR: 31.2210 - val_loss: 8.1691 - val_PSNR: 25.8974
Epoch 86/100
200/200 [==============================] - 2s 11ms/step - loss: 7.7635 - PSNR: 31.3348 - val_loss: 7.3638 - val_PSNR: 32.4020
Epoch 87/100
200/200 [==============================] - 2s 12ms/step - loss: 7.5876 - PSNR: 31.7768 - val_loss: 7.8225 - val_PSNR: 31.3660
Epoch 88/100
200/200 [==============================] - 3s 13ms/step - loss: 7.5301 - PSNR: 31.6759 - val_loss: 8.3169 - val_PSNR: 32.3539
Epoch 89/100
200/200 [==============================] - 2s 11ms/step - loss: 7.6006 - PSNR: 31.6251 - val_loss: 7.7150 - val_PSNR: 30.6743
Epoch 90/100
200/200 [==============================] - 2s 11ms/step - loss: 7.5246 - PSNR: 30.8012 - val_loss: 7.5590 - val_PSNR: 28.2063
Epoch 91/100
200/200 [==============================] - 2s 12ms/step - loss: 7.5014 - PSNR: 31.1159 - val_loss: 8.1312 - val_PSNR: 31.0736
Epoch 92/100
200/200 [==============================] - 2s 11ms/step - loss: 7.4756 - PSNR: 31.6070 - val_loss: 7.7618 - val_PSNR: 26.8834
Epoch 93/100
200/200 [==============================] - 2s 12ms/step - loss: 7.6049 - PSNR: 31.6874 - val_loss: 7.1874 - val_PSNR: 28.4668
Epoch 94/100
200/200 [==============================] - 2s 11ms/step - loss: 7.5835 - PSNR: 32.0234 - val_loss: 6.7026 - val_PSNR: 33.3985
Epoch 95/100
200/200 [==============================] - 2s 11ms/step - loss: 7.5406 - PSNR: 31.1873 - val_loss: 7.0961 - val_PSNR: 28.7118
Epoch 96/100
200/200 [==============================] - 2s 11ms/step - loss: 7.3615 - PSNR: 31.1643 - val_loss: 7.0021 - val_PSNR: 30.8735
Epoch 97/100
200/200 [==============================] - 2s 11ms/step - loss: 7.4684 - PSNR: 31.0948 - val_loss: 6.7321 - val_PSNR: 30.7083
Epoch 98/100
200/200 [==============================] - 2s 12ms/step - loss: 7.6235 - PSNR: 31.3131 - val_loss: 7.1866 - val_PSNR: 30.4694
Epoch 99/100
200/200 [==============================] - 2s 11ms/step - loss: 7.5301 - PSNR: 31.3471 - val_loss: 7.1967 - val_PSNR: 34.8881
Epoch 100/100
200/200 [==============================] - 2s 11ms/step - loss: 7.4021 - PSNR: 31.6633 - val_loss: 7.2394 - val_PSNR: 29.3262
Training time: 263.31 seconds
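The `up_sampling2d` layers in the summary above are parameter-free (0 params): by default, Keras `UpSampling2D` performs nearest-neighbour interpolation, simply repeating each pixel. A minimal NumPy sketch of that behaviour for a single channel (an illustration, not the notebook's Keras code):

```python
import numpy as np

def upsample_nearest(x, scale=2):
    """Nearest-neighbour upsampling: repeat each pixel scale x scale times,
    mirroring the default behaviour of Keras UpSampling2D."""
    return np.repeat(np.repeat(x, scale, axis=0), scale, axis=1)

x = np.array([[1, 2],
              [3, 4]])
print(upsample_nearest(x))
# [[1 1 2 2]
#  [1 1 2 2]
#  [3 3 4 4]
#  [3 3 4 4]]
```

Because the layer learns nothing, the two stacked `UpSampling2D` stages add no parameters (188,163 total vs. 483,587 for the sub-pixel variant below), which is consistent with this run's shorter training time.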
Train with Residual Blocks: 2 and Upsample method: SubPixelConv
Model: "edsr_model_2"
__________________________________________________________________________________________________
 Layer (type)                   Output Shape         Param #     Connected to                     
==================================================================================================
 input_3 (InputLayer)           [(None, None, None,  0           []                               
                                 3)]                                                              
                                                                                                  
 rescaling_4 (Rescaling)        (None, None, None,   0           ['input_3[0][0]']                
                                3)                                                                
                                                                                                  
 conv2d_14 (Conv2D)             (None, None, None,   1792        ['rescaling_4[0][0]']            
                                64)                                                               
                                                                                                  
 conv2d_15 (Conv2D)             (None, None, None,   36928       ['conv2d_14[0][0]']              
                                64)                                                               
                                                                                                  
 conv2d_16 (Conv2D)             (None, None, None,   36928       ['conv2d_15[0][0]']              
                                64)                                                               
                                                                                                  
 add_6 (Add)                    (None, None, None,   0           ['conv2d_14[0][0]',              
                                64)                               'conv2d_16[0][0]']              
                                                                                                  
 conv2d_17 (Conv2D)             (None, None, None,   36928       ['add_6[0][0]']                  
                                64)                                                               
                                                                                                  
 conv2d_18 (Conv2D)             (None, None, None,   36928       ['conv2d_17[0][0]']              
                                64)                                                               
                                                                                                  
 add_7 (Add)                    (None, None, None,   0           ['add_6[0][0]',                  
                                64)                               'conv2d_18[0][0]']              
                                                                                                  
 conv2d_19 (Conv2D)             (None, None, None,   36928       ['add_7[0][0]']                  
                                64)                                                               
                                                                                                  
 add_8 (Add)                    (None, None, None,   0           ['conv2d_14[0][0]',              
                                64)                               'conv2d_19[0][0]']              
                                                                                                  
 conv2d_20 (Conv2D)             (None, None, None,   147712      ['add_8[0][0]']                  
                                256)                                                              
                                                                                                  
 tf.nn.depth_to_space (TFOpLamb  (None, None, None,   0          ['conv2d_20[0][0]']              
 da)                            64)                                                               
                                                                                                  
 conv2d_21 (Conv2D)             (None, None, None,   147712      ['tf.nn.depth_to_space[0][0]']   
                                256)                                                              
                                                                                                  
 tf.nn.depth_to_space_1 (TFOpLa  (None, None, None,   0          ['conv2d_21[0][0]']              
 mbda)                          64)                                                               
                                                                                                  
 conv2d_22 (Conv2D)             (None, None, None,   1731        ['tf.nn.depth_to_space_1[0][0]'] 
                                3)                                                                
                                                                                                  
 rescaling_5 (Rescaling)        (None, None, None,   0           ['conv2d_22[0][0]']              
                                3)                                                                
                                                                                                  
==================================================================================================
Total params: 483,587
Trainable params: 483,587
Non-trainable params: 0
__________________________________________________________________________________________________
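The `tf.nn.depth_to_space` layers in this summary implement sub-pixel convolution (pixel shuffle): a preceding Conv2D expands the channels to C·r², then the shuffle rearranges those channels into an r×-larger spatial grid. A minimal NumPy sketch of the rearrangement for a single HWC image (an illustration of the operation, not the notebook's TensorFlow code):

```python
import numpy as np

def depth_to_space(x, r):
    """Rearrange an (H, W, C*r*r) tensor into (H*r, W*r, C),
    matching tf.nn.depth_to_space for NHWC layout."""
    h, w, c = x.shape
    c_out = c // (r * r)
    x = x.reshape(h, w, r, r, c_out)      # split channels into an r x r block
    x = x.transpose(0, 2, 1, 3, 4)        # interleave blocks with spatial dims
    return x.reshape(h * r, w * r, c_out)

x = np.arange(4).reshape(1, 1, 4)         # one pixel with 4 channels, r = 2
print(depth_to_space(x, 2).reshape(2, 2))
# [[0 1]
#  [2 3]]
```

Unlike `UpSampling2D`, the upscaling here is learned: each 256-channel Conv2D in the summary contributes 147,712 parameters, which accounts for most of the gap between this model's 483,587 parameters and the 188,163 of the UpSampling2D variant.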
Epoch 1/100
200/200 [==============================] - 7s 16ms/step - loss: 19.9364 - PSNR: 21.1484 - val_loss: 11.5583 - val_PSNR: 24.9886
Epoch 2/100
200/200 [==============================] - 3s 15ms/step - loss: 10.4361 - PSNR: 25.9898 - val_loss: 9.5414 - val_PSNR: 25.0997
Epoch 3/100
200/200 [==============================] - 3s 14ms/step - loss: 9.7350 - PSNR: 27.5126 - val_loss: 9.1285 - val_PSNR: 27.8923
Epoch 4/100
200/200 [==============================] - 3s 14ms/step - loss: 9.2531 - PSNR: 28.3782 - val_loss: 8.8627 - val_PSNR: 27.0268
Epoch 5/100
200/200 [==============================] - 3s 14ms/step - loss: 8.7784 - PSNR: 28.0788 - val_loss: 8.5706 - val_PSNR: 27.3388
Epoch 6/100
200/200 [==============================] - 3s 15ms/step - loss: 8.7196 - PSNR: 28.9939 - val_loss: 8.2113 - val_PSNR: 25.9884
Epoch 7/100
200/200 [==============================] - 3s 16ms/step - loss: 8.3032 - PSNR: 29.2523 - val_loss: 8.1629 - val_PSNR: 28.6100
Epoch 8/100
200/200 [==============================] - 3s 15ms/step - loss: 8.4159 - PSNR: 29.2803 - val_loss: 9.1476 - val_PSNR: 25.7118
Epoch 9/100
200/200 [==============================] - 3s 15ms/step - loss: 8.5258 - PSNR: 29.2039 - val_loss: 8.3977 - val_PSNR: 25.9473
Epoch 10/100
200/200 [==============================] - 3s 14ms/step - loss: 7.9809 - PSNR: 29.7655 - val_loss: 8.3805 - val_PSNR: 28.8989
Epoch 11/100
200/200 [==============================] - 3s 14ms/step - loss: 7.9020 - PSNR: 30.3885 - val_loss: 7.3717 - val_PSNR: 30.3592
Epoch 12/100
200/200 [==============================] - 3s 13ms/step - loss: 8.2258 - PSNR: 30.4368 - val_loss: 8.4881 - val_PSNR: 30.0992
Epoch 13/100
200/200 [==============================] - 3s 14ms/step - loss: 8.0684 - PSNR: 29.5731 - val_loss: 8.4474 - val_PSNR: 29.3011
Epoch 14/100
200/200 [==============================] - 3s 14ms/step - loss: 7.9466 - PSNR: 31.0765 - val_loss: 8.1879 - val_PSNR: 30.2720
Epoch 15/100
200/200 [==============================] - 3s 14ms/step - loss: 7.8628 - PSNR: 30.3140 - val_loss: 7.6130 - val_PSNR: 27.1363
Epoch 16/100
200/200 [==============================] - 3s 14ms/step - loss: 7.8999 - PSNR: 30.5306 - val_loss: 7.9669 - val_PSNR: 32.0136
Epoch 17/100
200/200 [==============================] - 3s 15ms/step - loss: 7.7889 - PSNR: 30.6703 - val_loss: 8.3260 - val_PSNR: 28.5700
Epoch 18/100
200/200 [==============================] - 3s 16ms/step - loss: 7.8084 - PSNR: 30.5749 - val_loss: 8.4708 - val_PSNR: 26.8700
Epoch 19/100
200/200 [==============================] - 3s 15ms/step - loss: 7.7024 - PSNR: 30.5535 - val_loss: 7.7492 - val_PSNR: 28.0154
Epoch 20/100
200/200 [==============================] - 3s 14ms/step - loss: 7.8762 - PSNR: 30.5456 - val_loss: 8.4699 - val_PSNR: 28.4599
Epoch 21/100
200/200 [==============================] - 3s 15ms/step - loss: 7.7294 - PSNR: 31.9733 - val_loss: 7.6593 - val_PSNR: 30.1452
Epoch 22/100
200/200 [==============================] - 3s 14ms/step - loss: 7.7135 - PSNR: 30.6450 - val_loss: 6.9965 - val_PSNR: 31.9706
Epoch 23/100
200/200 [==============================] - 3s 14ms/step - loss: 7.5771 - PSNR: 30.3776 - val_loss: 7.2930 - val_PSNR: 29.7878
Epoch 24/100
200/200 [==============================] - 3s 14ms/step - loss: 7.3858 - PSNR: 31.3204 - val_loss: 7.7236 - val_PSNR: 29.1121
Epoch 25/100
200/200 [==============================] - 3s 14ms/step - loss: 7.6594 - PSNR: 30.7773 - val_loss: 8.1619 - val_PSNR: 26.2872
Epoch 26/100
200/200 [==============================] - 3s 14ms/step - loss: 7.5092 - PSNR: 30.7889 - val_loss: 8.3751 - val_PSNR: 29.6786
Epoch 27/100
200/200 [==============================] - 3s 14ms/step - loss: 7.5692 - PSNR: 31.0233 - val_loss: 6.8271 - val_PSNR: 34.1893
Epoch 28/100
200/200 [==============================] - 3s 15ms/step - loss: 7.3596 - PSNR: 32.1568 - val_loss: 7.6855 - val_PSNR: 30.0937
Epoch 29/100
200/200 [==============================] - 3s 16ms/step - loss: 7.3828 - PSNR: 31.6762 - val_loss: 7.3621 - val_PSNR: 30.9527
Epoch 30/100
200/200 [==============================] - 3s 15ms/step - loss: 7.5838 - PSNR: 31.5626 - val_loss: 7.9217 - val_PSNR: 29.4647
Epoch 31/100
200/200 [==============================] - 3s 13ms/step - loss: 7.3999 - PSNR: 31.7114 - val_loss: 7.2840 - val_PSNR: 32.0435
Epoch 32/100
200/200 [==============================] - 3s 15ms/step - loss: 7.3858 - PSNR: 32.8433 - val_loss: 7.1396 - val_PSNR: 27.7587
Epoch 33/100
200/200 [==============================] - 3s 14ms/step - loss: 7.5131 - PSNR: 31.9065 - val_loss: 6.7779 - val_PSNR: 29.0516
Epoch 34/100
200/200 [==============================] - 3s 14ms/step - loss: 7.3939 - PSNR: 31.7528 - val_loss: 7.5701 - val_PSNR: 31.3748
Epoch 35/100
200/200 [==============================] - 3s 14ms/step - loss: 7.3727 - PSNR: 32.0708 - val_loss: 7.7892 - val_PSNR: 32.0866
Epoch 36/100
200/200 [==============================] - 3s 15ms/step - loss: 7.3253 - PSNR: 31.6111 - val_loss: 8.3279 - val_PSNR: 29.1839
Epoch 37/100
200/200 [==============================] - 3s 14ms/step - loss: 7.3475 - PSNR: 31.3984 - val_loss: 8.1773 - val_PSNR: 30.4604
Epoch 38/100
200/200 [==============================] - 3s 14ms/step - loss: 7.4718 - PSNR: 31.6775 - val_loss: 7.9563 - val_PSNR: 27.9124
Epoch 39/100
200/200 [==============================] - 3s 14ms/step - loss: 7.3639 - PSNR: 31.6250 - val_loss: 7.6081 - val_PSNR: 25.7692
Epoch 40/100
200/200 [==============================] - 3s 17ms/step - loss: 7.4586 - PSNR: 31.7421 - val_loss: 7.1788 - val_PSNR: 32.0922
Epoch 41/100
200/200 [==============================] - 3s 15ms/step - loss: 7.4375 - PSNR: 31.1866 - val_loss: 7.9349 - val_PSNR: 32.7080
Epoch 42/100
200/200 [==============================] - 3s 14ms/step - loss: 7.3006 - PSNR: 31.9034 - val_loss: 6.8506 - val_PSNR: 32.9287
Epoch 43/100
200/200 [==============================] - 3s 14ms/step - loss: 7.5636 - PSNR: 31.5184 - val_loss: 7.5731 - val_PSNR: 30.1953
Epoch 44/100
200/200 [==============================] - 3s 15ms/step - loss: 7.2326 - PSNR: 32.8681 - val_loss: 7.7718 - val_PSNR: 31.5258
Epoch 45/100
200/200 [==============================] - 3s 14ms/step - loss: 7.5296 - PSNR: 30.9189 - val_loss: 7.1726 - val_PSNR: 31.9731
Epoch 46/100
200/200 [==============================] - 3s 14ms/step - loss: 7.2767 - PSNR: 32.4489 - val_loss: 8.1406 - val_PSNR: 30.2914
Epoch 47/100
200/200 [==============================] - 3s 14ms/step - loss: 7.4189 - PSNR: 31.6966 - val_loss: 8.3045 - val_PSNR: 33.7047
Epoch 48/100
200/200 [==============================] - 3s 14ms/step - loss: 7.3769 - PSNR: 31.9412 - val_loss: 7.5024 - val_PSNR: 33.4558
Epoch 49/100
200/200 [==============================] - 3s 14ms/step - loss: 7.3036 - PSNR: 31.3715 - val_loss: 7.9106 - val_PSNR: 34.4464
Epoch 50/100
200/200 [==============================] - 3s 14ms/step - loss: 7.2507 - PSNR: 32.3822 - val_loss: 7.6090 - val_PSNR: 30.2165
Epoch 51/100
200/200 [==============================] - 3s 17ms/step - loss: 7.3284 - PSNR: 31.6690 - val_loss: 7.7522 - val_PSNR: 30.5372
Epoch 52/100
200/200 [==============================] - 3s 15ms/step - loss: 7.4655 - PSNR: 31.3171 - val_loss: 7.3817 - val_PSNR: 28.7950
Epoch 53/100
200/200 [==============================] - 3s 14ms/step - loss: 7.2009 - PSNR: 32.5989 - val_loss: 6.7343 - val_PSNR: 28.0336
Epoch 54/100
200/200 [==============================] - 3s 14ms/step - loss: 7.1439 - PSNR: 33.0634 - val_loss: 8.1054 - val_PSNR: 28.0920
Epoch 55/100
200/200 [==============================] - 3s 14ms/step - loss: 7.3491 - PSNR: 33.1126 - val_loss: 8.0159 - val_PSNR: 31.3952
Epoch 56/100
200/200 [==============================] - 3s 14ms/step - loss: 7.3924 - PSNR: 32.6249 - val_loss: 7.5059 - val_PSNR: 29.1734
Epoch 57/100
200/200 [==============================] - 3s 14ms/step - loss: 7.2920 - PSNR: 31.9586 - val_loss: 7.1865 - val_PSNR: 26.2483
Epoch 58/100
200/200 [==============================] - 3s 14ms/step - loss: 7.3228 - PSNR: 31.5449 - val_loss: 6.7273 - val_PSNR: 31.8893
Epoch 59/100
200/200 [==============================] - 3s 15ms/step - loss: 7.3941 - PSNR: 32.5168 - val_loss: 8.0442 - val_PSNR: 30.1575
Epoch 60/100
200/200 [==============================] - 3s 14ms/step - loss: 7.2805 - PSNR: 32.2614 - val_loss: 7.7009 - val_PSNR: 33.3516
Epoch 61/100
200/200 [==============================] - 3s 14ms/step - loss: 7.1750 - PSNR: 32.9884 - val_loss: 7.1154 - val_PSNR: 32.9347
Epoch 62/100
200/200 [==============================] - 3s 16ms/step - loss: 7.2296 - PSNR: 32.6826 - val_loss: 8.0734 - val_PSNR: 27.8480
Epoch 63/100
200/200 [==============================] - 3s 15ms/step - loss: 7.2012 - PSNR: 31.1990 - val_loss: 7.3701 - val_PSNR: 29.2844
Epoch 64/100
200/200 [==============================] - 3s 14ms/step - loss: 7.2205 - PSNR: 32.3124 - val_loss: 7.0589 - val_PSNR: 28.2314
Epoch 65/100
200/200 [==============================] - 3s 14ms/step - loss: 7.0611 - PSNR: 31.4840 - val_loss: 8.0986 - val_PSNR: 30.5952
Epoch 66/100
200/200 [==============================] - 3s 14ms/step - loss: 7.3359 - PSNR: 31.9840 - val_loss: 7.1298 - val_PSNR: 32.1800
Epoch 67/100
200/200 [==============================] - 3s 15ms/step - loss: 7.2695 - PSNR: 32.9859 - val_loss: 7.3108 - val_PSNR: 31.9192
Epoch 68/100
200/200 [==============================] - 3s 14ms/step - loss: 7.1450 - PSNR: 32.1345 - val_loss: 6.6878 - val_PSNR: 30.5519
Epoch 69/100
200/200 [==============================] - 3s 14ms/step - loss: 7.2115 - PSNR: 31.8601 - val_loss: 7.0704 - val_PSNR: 30.5323
Epoch 70/100
200/200 [==============================] - 3s 14ms/step - loss: 7.1791 - PSNR: 31.8223 - val_loss: 6.9826 - val_PSNR: 27.9944
Epoch 71/100
200/200 [==============================] - 3s 14ms/step - loss: 7.1885 - PSNR: 32.0171 - val_loss: 7.0414 - val_PSNR: 31.8703
Epoch 72/100
200/200 [==============================] - 3s 14ms/step - loss: 7.3008 - PSNR: 33.0246 - val_loss: 7.3282 - val_PSNR: 30.4377
Epoch 73/100
200/200 [==============================] - 3s 16ms/step - loss: 7.2214 - PSNR: 31.9020 - val_loss: 6.9890 - val_PSNR: 31.8796
Epoch 74/100
200/200 [==============================] - 3s 15ms/step - loss: 7.1839 - PSNR: 33.2237 - val_loss: 6.5868 - val_PSNR: 31.2789
Epoch 75/100
200/200 [==============================] - 3s 15ms/step - loss: 7.0844 - PSNR: 31.7623 - val_loss: 7.9182 - val_PSNR: 28.8939
Epoch 76/100
200/200 [==============================] - 3s 14ms/step - loss: 7.2719 - PSNR: 31.4579 - val_loss: 7.3036 - val_PSNR: 28.0535
Epoch 77/100
200/200 [==============================] - 3s 14ms/step - loss: 7.1315 - PSNR: 32.9078 - val_loss: 6.3473 - val_PSNR: 31.4398
Epoch 78/100
200/200 [==============================] - 3s 15ms/step - loss: 7.2049 - PSNR: 31.8331 - val_loss: 7.1885 - val_PSNR: 31.9028
Epoch 79/100
200/200 [==============================] - 3s 14ms/step - loss: 7.2785 - PSNR: 32.6488 - val_loss: 7.8721 - val_PSNR: 32.6574
Epoch 80/100
200/200 [==============================] - 3s 14ms/step - loss: 7.0614 - PSNR: 32.0479 - val_loss: 7.0978 - val_PSNR: 32.2979
Epoch 81/100
200/200 [==============================] - 3s 14ms/step - loss: 7.0401 - PSNR: 33.3604 - val_loss: 6.5817 - val_PSNR: 35.0351
Epoch 82/100
200/200 [==============================] - 3s 15ms/step - loss: 7.2567 - PSNR: 32.3051 - val_loss: 6.8366 - val_PSNR: 33.8140
Epoch 83/100
200/200 [==============================] - 3s 16ms/step - loss: 7.1195 - PSNR: 31.6799 - val_loss: 6.7351 - val_PSNR: 31.5678
Epoch 84/100
200/200 [==============================] - 3s 14ms/step - loss: 7.0838 - PSNR: 32.9968 - val_loss: 6.8786 - val_PSNR: 31.0021
Epoch 85/100
200/200 [==============================] - 3s 14ms/step - loss: 7.2076 - PSNR: 32.5142 - val_loss: 6.5069 - val_PSNR: 29.5797
Epoch 86/100
200/200 [==============================] - 3s 16ms/step - loss: 7.2009 - PSNR: 32.0901 - val_loss: 7.1922 - val_PSNR: 34.3753
Epoch 87/100
200/200 [==============================] - 3s 14ms/step - loss: 7.1453 - PSNR: 31.9015 - val_loss: 7.6506 - val_PSNR: 32.7161
Epoch 88/100
200/200 [==============================] - 3s 14ms/step - loss: 7.1354 - PSNR: 32.3530 - val_loss: 7.2652 - val_PSNR: 30.5757
Epoch 89/100
200/200 [==============================] - 3s 14ms/step - loss: 7.1998 - PSNR: 32.0576 - val_loss: 6.6434 - val_PSNR: 33.9764
Epoch 90/100
200/200 [==============================] - 3s 15ms/step - loss: 7.1675 - PSNR: 32.3119 - val_loss: 7.2781 - val_PSNR: 31.9898
Epoch 91/100
200/200 [==============================] - 3s 14ms/step - loss: 7.0893 - PSNR: 32.7028 - val_loss: 7.3737 - val_PSNR: 34.4476
Epoch 92/100
200/200 [==============================] - 3s 14ms/step - loss: 7.0790 - PSNR: 31.4911 - val_loss: 6.7780 - val_PSNR: 32.1392
Epoch 93/100
200/200 [==============================] - 3s 14ms/step - loss: 7.0485 - PSNR: 31.6462 - val_loss: 6.8930 - val_PSNR: 29.5546
Epoch 94/100
200/200 [==============================] - 3s 16ms/step - loss: 7.1511 - PSNR: 32.5379 - val_loss: 7.5034 - val_PSNR: 29.0151
Epoch 95/100
200/200 [==============================] - 3s 14ms/step - loss: 7.1365 - PSNR: 33.1045 - val_loss: 7.6059 - val_PSNR: 32.7081
Epoch 96/100
200/200 [==============================] - 3s 14ms/step - loss: 7.1682 - PSNR: 31.5847 - val_loss: 6.7329 - val_PSNR: 29.7949
Epoch 97/100
200/200 [==============================] - 3s 16ms/step - loss: 7.0104 - PSNR: 32.4609 - val_loss: 6.9102 - val_PSNR: 30.6054
Epoch 98/100
200/200 [==============================] - 3s 14ms/step - loss: 7.1354 - PSNR: 31.6418 - val_loss: 7.0505 - val_PSNR: 28.2734
Epoch 99/100
200/200 [==============================] - 3s 14ms/step - loss: 7.1608 - PSNR: 32.2478 - val_loss: 7.1864 - val_PSNR: 33.0302
Epoch 100/100
200/200 [==============================] - 3s 14ms/step - loss: 7.0654 - PSNR: 31.6753 - val_loss: 7.8460 - val_PSNR: 29.3488
Training time: 292.40515065193176
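The PSNR values logged above presumably follow the standard peak signal-to-noise ratio definition for 8-bit images. As a hedged illustration (not the notebook's actual metric, which likely uses TensorFlow ops such as `tf.image.psnr`), the computation reduces to:

```python
import numpy as np

def psnr(y_true, y_pred, max_val=255.0):
    """Peak signal-to-noise ratio: 20*log10(max_val) - 10*log10(MSE).

    Higher values mean the reconstruction is closer to the ground truth;
    assumes pixel values lie in [0, max_val].
    """
    mse = np.mean((np.asarray(y_true, dtype=np.float64)
                   - np.asarray(y_pred, dtype=np.float64)) ** 2)
    return 20.0 * np.log10(max_val) - 10.0 * np.log10(mse)
```

For example, predicting an all-255 image against an all-zero target gives MSE = 255², so the two log terms cancel and PSNR is exactly 0 dB.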
Train with Residual Blocks: 16 and Upsample method: Conv2DTranspose
Model: "edsr_model_3"
__________________________________________________________________________________________________
 Layer (type)                   Output Shape         Param #     Connected to                     
==================================================================================================
 input_4 (InputLayer)           [(None, None, None,  0           []                               
                                 3)]                                                              
                                                                                                  
 rescaling_6 (Rescaling)        (None, None, None,   0           ['input_4[0][0]']                
                                3)                                                                
                                                                                                  
 conv2d_23 (Conv2D)             (None, None, None,   1792        ['rescaling_6[0][0]']            
                                64)                                                               
                                                                                                  
 conv2d_24 (Conv2D)             (None, None, None,   36928       ['conv2d_23[0][0]']              
                                64)                                                               
                                                                                                  
 conv2d_25 (Conv2D)             (None, None, None,   36928       ['conv2d_24[0][0]']              
                                64)                                                               
                                                                                                  
 add_9 (Add)                    (None, None, None,   0           ['conv2d_23[0][0]',              
                                64)                               'conv2d_25[0][0]']              
                                                                                                  
 conv2d_26 (Conv2D)             (None, None, None,   36928       ['add_9[0][0]']                  
                                64)                                                               
                                                                                                  
 conv2d_27 (Conv2D)             (None, None, None,   36928       ['conv2d_26[0][0]']              
                                64)                                                               
                                                                                                  
 add_10 (Add)                   (None, None, None,   0           ['add_9[0][0]',                  
                                64)                               'conv2d_27[0][0]']              
                                                                                                  
 conv2d_28 (Conv2D)             (None, None, None,   36928       ['add_10[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_29 (Conv2D)             (None, None, None,   36928       ['conv2d_28[0][0]']              
                                64)                                                               
                                                                                                  
 add_11 (Add)                   (None, None, None,   0           ['add_10[0][0]',                 
                                64)                               'conv2d_29[0][0]']              
                                                                                                  
 conv2d_30 (Conv2D)             (None, None, None,   36928       ['add_11[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_31 (Conv2D)             (None, None, None,   36928       ['conv2d_30[0][0]']              
                                64)                                                               
                                                                                                  
 add_12 (Add)                   (None, None, None,   0           ['add_11[0][0]',                 
                                64)                               'conv2d_31[0][0]']              
                                                                                                  
 conv2d_32 (Conv2D)             (None, None, None,   36928       ['add_12[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_33 (Conv2D)             (None, None, None,   36928       ['conv2d_32[0][0]']              
                                64)                                                               
                                                                                                  
 add_13 (Add)                   (None, None, None,   0           ['add_12[0][0]',                 
                                64)                               'conv2d_33[0][0]']              
                                                                                                  
 conv2d_34 (Conv2D)             (None, None, None,   36928       ['add_13[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_35 (Conv2D)             (None, None, None,   36928       ['conv2d_34[0][0]']              
                                64)                                                               
                                                                                                  
 add_14 (Add)                   (None, None, None,   0           ['add_13[0][0]',                 
                                64)                               'conv2d_35[0][0]']              
                                                                                                  
 conv2d_36 (Conv2D)             (None, None, None,   36928       ['add_14[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_37 (Conv2D)             (None, None, None,   36928       ['conv2d_36[0][0]']              
                                64)                                                               
                                                                                                  
 add_15 (Add)                   (None, None, None,   0           ['add_14[0][0]',                 
                                64)                               'conv2d_37[0][0]']              
                                                                                                  
 conv2d_38 (Conv2D)             (None, None, None,   36928       ['add_15[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_39 (Conv2D)             (None, None, None,   36928       ['conv2d_38[0][0]']              
                                64)                                                               
                                                                                                  
 add_16 (Add)                   (None, None, None,   0           ['add_15[0][0]',                 
                                64)                               'conv2d_39[0][0]']              
                                                                                                  
 conv2d_40 (Conv2D)             (None, None, None,   36928       ['add_16[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_41 (Conv2D)             (None, None, None,   36928       ['conv2d_40[0][0]']              
                                64)                                                               
                                                                                                  
 add_17 (Add)                   (None, None, None,   0           ['add_16[0][0]',                 
                                64)                               'conv2d_41[0][0]']              
                                                                                                  
 conv2d_42 (Conv2D)             (None, None, None,   36928       ['add_17[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_43 (Conv2D)             (None, None, None,   36928       ['conv2d_42[0][0]']              
                                64)                                                               
                                                                                                  
 add_18 (Add)                   (None, None, None,   0           ['add_17[0][0]',                 
                                64)                               'conv2d_43[0][0]']              
                                                                                                  
 conv2d_44 (Conv2D)             (None, None, None,   36928       ['add_18[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_45 (Conv2D)             (None, None, None,   36928       ['conv2d_44[0][0]']              
                                64)                                                               
                                                                                                  
 add_19 (Add)                   (None, None, None,   0           ['add_18[0][0]',                 
                                64)                               'conv2d_45[0][0]']              
                                                                                                  
 conv2d_46 (Conv2D)             (None, None, None,   36928       ['add_19[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_47 (Conv2D)             (None, None, None,   36928       ['conv2d_46[0][0]']              
                                64)                                                               
                                                                                                  
 add_20 (Add)                   (None, None, None,   0           ['add_19[0][0]',                 
                                64)                               'conv2d_47[0][0]']              
                                                                                                  
 conv2d_48 (Conv2D)             (None, None, None,   36928       ['add_20[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_49 (Conv2D)             (None, None, None,   36928       ['conv2d_48[0][0]']              
                                64)                                                               
                                                                                                  
 add_21 (Add)                   (None, None, None,   0           ['add_20[0][0]',                 
                                64)                               'conv2d_49[0][0]']              
                                                                                                  
 conv2d_50 (Conv2D)             (None, None, None,   36928       ['add_21[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_51 (Conv2D)             (None, None, None,   36928       ['conv2d_50[0][0]']              
                                64)                                                               
                                                                                                  
 add_22 (Add)                   (None, None, None,   0           ['add_21[0][0]',                 
                                64)                               'conv2d_51[0][0]']              
                                                                                                  
 conv2d_52 (Conv2D)             (None, None, None,   36928       ['add_22[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_53 (Conv2D)             (None, None, None,   36928       ['conv2d_52[0][0]']              
                                64)                                                               
                                                                                                  
 add_23 (Add)                   (None, None, None,   0           ['add_22[0][0]',                 
                                64)                               'conv2d_53[0][0]']              
                                                                                                  
 conv2d_54 (Conv2D)             (None, None, None,   36928       ['add_23[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_55 (Conv2D)             (None, None, None,   36928       ['conv2d_54[0][0]']              
                                64)                                                               
                                                                                                  
 add_24 (Add)                   (None, None, None,   0           ['add_23[0][0]',                 
                                64)                               'conv2d_55[0][0]']              
                                                                                                  
 conv2d_56 (Conv2D)             (None, None, None,   36928       ['add_24[0][0]']                 
                                64)                                                               
                                                                                                  
 add_25 (Add)                   (None, None, None,   0           ['conv2d_23[0][0]',              
                                64)                               'conv2d_56[0][0]']              
                                                                                                  
 conv2d_transpose_2 (Conv2DTran  (None, None, None,   36928      ['add_25[0][0]']                 
 spose)                         64)                                                               
                                                                                                  
 conv2d_transpose_3 (Conv2DTran  (None, None, None,   36928      ['conv2d_transpose_2[0][0]']     
 spose)                         64)                                                               
                                                                                                  
 conv2d_57 (Conv2D)             (None, None, None,   1731        ['conv2d_transpose_3[0][0]']     
                                3)                                                                
                                                                                                  
 rescaling_7 (Rescaling)        (None, None, None,   0           ['conv2d_57[0][0]']              
                                3)                                                                
                                                                                                  
==================================================================================================
Total params: 1,296,003
Trainable params: 1,296,003
Non-trainable params: 0
__________________________________________________________________________________________________
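In the summary above, the sub-pixel upsampling of the base model is replaced by two `Conv2DTranspose` layers. A minimal sketch of such an upsampling stage, assuming 3×3 kernels, stride 2, and 64 filters (consistent with the 36,928 parameters per layer, i.e. 3·3·64·64 + 64), might look like:

```python
import tensorflow as tf
from tensorflow.keras import layers

def upsample_transpose(x, filters=64):
    # Two stride-2 transposed convolutions give a total 4x spatial upscale.
    # Kernel size 3 with 64 input/output channels yields 36,928 params each
    # (3*3*64*64 weights + 64 biases), matching the summary above.
    x = layers.Conv2DTranspose(filters, 3, strides=2, padding="same")(x)
    x = layers.Conv2DTranspose(filters, 3, strides=2, padding="same")(x)
    return x
```

Unlike the sub-pixel (pixel-shuffle) variant, the transposed convolutions learn the upsampling filters directly, which is reflected in the slower per-step time of this run (≈28–31 ms/step vs ≈14 ms/step for the previous model).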
Epoch 1/100
200/200 [==============================] - 18s 35ms/step - loss: 27.7585 - PSNR: 18.6042 - val_loss: 14.4214 - val_PSNR: 21.6992
Epoch 2/100
200/200 [==============================] - 6s 29ms/step - loss: 13.4037 - PSNR: 24.6047 - val_loss: 12.3553 - val_PSNR: 22.4603
Epoch 3/100
200/200 [==============================] - 6s 30ms/step - loss: 11.5217 - PSNR: 26.4089 - val_loss: 9.7268 - val_PSNR: 26.1713
Epoch 4/100
200/200 [==============================] - 5s 27ms/step - loss: 10.3516 - PSNR: 27.4343 - val_loss: 11.0647 - val_PSNR: 24.7771
Epoch 5/100
200/200 [==============================] - 6s 31ms/step - loss: 10.2707 - PSNR: 27.6752 - val_loss: 10.1297 - val_PSNR: 27.0397
Epoch 6/100
200/200 [==============================] - 6s 29ms/step - loss: 9.4021 - PSNR: 28.0402 - val_loss: 9.4148 - val_PSNR: 28.4283
Epoch 7/100
200/200 [==============================] - 6s 31ms/step - loss: 9.2853 - PSNR: 28.6266 - val_loss: 8.9280 - val_PSNR: 27.9836
Epoch 8/100
200/200 [==============================] - 5s 27ms/step - loss: 9.1427 - PSNR: 29.0787 - val_loss: 9.6806 - val_PSNR: 26.3127
Epoch 9/100
200/200 [==============================] - 6s 29ms/step - loss: 8.9918 - PSNR: 29.0825 - val_loss: 9.8323 - val_PSNR: 24.8713
Epoch 10/100
200/200 [==============================] - 6s 31ms/step - loss: 8.9246 - PSNR: 29.3340 - val_loss: 8.9960 - val_PSNR: 29.9751
Epoch 11/100
200/200 [==============================] - 6s 29ms/step - loss: 8.6895 - PSNR: 29.1636 - val_loss: 8.3293 - val_PSNR: 27.3292
Epoch 12/100
200/200 [==============================] - 6s 30ms/step - loss: 8.5942 - PSNR: 29.3293 - val_loss: 7.8372 - val_PSNR: 28.4572
Epoch 13/100
200/200 [==============================] - 6s 28ms/step - loss: 8.2268 - PSNR: 30.4978 - val_loss: 7.8768 - val_PSNR: 26.2750
Epoch 14/100
200/200 [==============================] - 6s 30ms/step - loss: 8.4917 - PSNR: 29.5750 - val_loss: 7.4263 - val_PSNR: 30.0816
Epoch 15/100
200/200 [==============================] - 6s 28ms/step - loss: 8.3405 - PSNR: 30.6572 - val_loss: 8.9186 - val_PSNR: 26.1167
Epoch 16/100
200/200 [==============================] - 6s 30ms/step - loss: 8.0499 - PSNR: 30.6080 - val_loss: 8.6048 - val_PSNR: 29.7102
Epoch 17/100
200/200 [==============================] - 6s 28ms/step - loss: 8.0661 - PSNR: 30.7900 - val_loss: 8.4033 - val_PSNR: 28.6747
Epoch 18/100
200/200 [==============================] - 6s 31ms/step - loss: 8.0943 - PSNR: 30.6168 - val_loss: 7.4278 - val_PSNR: 34.3799
Epoch 19/100
200/200 [==============================] - 6s 28ms/step - loss: 8.1856 - PSNR: 30.2466 - val_loss: 7.6615 - val_PSNR: 30.8135
Epoch 20/100
200/200 [==============================] - 6s 28ms/step - loss: 7.9866 - PSNR: 31.2392 - val_loss: 8.1658 - val_PSNR: 26.9898
Epoch 21/100
200/200 [==============================] - 6s 30ms/step - loss: 8.0154 - PSNR: 30.3042 - val_loss: 7.3804 - val_PSNR: 32.4653
Epoch 22/100
200/200 [==============================] - 6s 28ms/step - loss: 7.9294 - PSNR: 31.0380 - val_loss: 7.4959 - val_PSNR: 31.6419
Epoch 23/100
200/200 [==============================] - 6s 30ms/step - loss: 8.1841 - PSNR: 30.0814 - val_loss: 7.9071 - val_PSNR: 35.2032
Epoch 24/100
200/200 [==============================] - 6s 28ms/step - loss: 7.7503 - PSNR: 31.1814 - val_loss: 7.4072 - val_PSNR: 26.2578
Epoch 25/100
200/200 [==============================] - 6s 28ms/step - loss: 7.7024 - PSNR: 30.5286 - val_loss: 6.9916 - val_PSNR: 29.3408
Epoch 26/100
200/200 [==============================] - 6s 30ms/step - loss: 7.5369 - PSNR: 30.7396 - val_loss: 8.2668 - val_PSNR: 29.3047
Epoch 27/100
200/200 [==============================] - 6s 28ms/step - loss: 7.5320 - PSNR: 31.1796 - val_loss: 6.9909 - val_PSNR: 31.2073
Epoch 28/100
200/200 [==============================] - 6s 29ms/step - loss: 7.3867 - PSNR: 31.6524 - val_loss: 7.7802 - val_PSNR: 29.9163
Epoch 29/100
200/200 [==============================] - 6s 29ms/step - loss: 7.3866 - PSNR: 31.6775 - val_loss: 7.8736 - val_PSNR: 30.3131
Epoch 30/100
200/200 [==============================] - 6s 28ms/step - loss: 7.6456 - PSNR: 31.5262 - val_loss: 7.3670 - val_PSNR: 30.0080
Epoch 31/100
200/200 [==============================] - 6s 28ms/step - loss: 7.5588 - PSNR: 31.6155 - val_loss: 7.0184 - val_PSNR: 34.1210
Epoch 32/100
200/200 [==============================] - 6s 30ms/step - loss: 7.4592 - PSNR: 31.1254 - val_loss: 7.4870 - val_PSNR: 30.7247
Epoch 33/100
200/200 [==============================] - 5s 27ms/step - loss: 7.6126 - PSNR: 30.8605 - val_loss: 7.7996 - val_PSNR: 33.2647
Epoch 34/100
200/200 [==============================] - 6s 29ms/step - loss: 7.4852 - PSNR: 31.2046 - val_loss: 8.1698 - val_PSNR: 29.6269
Epoch 35/100
200/200 [==============================] - 6s 30ms/step - loss: 7.3162 - PSNR: 31.1209 - val_loss: 7.8451 - val_PSNR: 31.6448
Epoch 36/100
200/200 [==============================] - 5s 27ms/step - loss: 7.3192 - PSNR: 32.4286 - val_loss: 7.3996 - val_PSNR: 26.5538
Epoch 37/100
200/200 [==============================] - 6s 31ms/step - loss: 7.2762 - PSNR: 31.5503 - val_loss: 7.4264 - val_PSNR: 31.4539
Epoch 38/100
200/200 [==============================] - 6s 28ms/step - loss: 7.0703 - PSNR: 32.5896 - val_loss: 7.2558 - val_PSNR: 34.1297
Epoch 39/100
200/200 [==============================] - 6s 29ms/step - loss: 7.4220 - PSNR: 31.4827 - val_loss: 6.5384 - val_PSNR: 32.9082
Epoch 40/100
200/200 [==============================] - 6s 29ms/step - loss: 7.3187 - PSNR: 30.9510 - val_loss: 7.4063 - val_PSNR: 27.4890
Epoch 41/100
200/200 [==============================] - 6s 29ms/step - loss: 7.4870 - PSNR: 31.3210 - val_loss: 7.4020 - val_PSNR: 31.2964
Epoch 42/100
200/200 [==============================] - 5s 27ms/step - loss: 7.3706 - PSNR: 32.4870 - val_loss: 7.2702 - val_PSNR: 28.6105
Epoch 43/100
200/200 [==============================] - 6s 30ms/step - loss: 7.1139 - PSNR: 31.4998 - val_loss: 7.4461 - val_PSNR: 29.5535
Epoch 44/100
200/200 [==============================] - 6s 28ms/step - loss: 7.4108 - PSNR: 31.3439 - val_loss: 7.3426 - val_PSNR: 29.7280
Epoch 45/100
200/200 [==============================] - 6s 28ms/step - loss: 7.4200 - PSNR: 30.8727 - val_loss: 7.0008 - val_PSNR: 33.2316
Epoch 46/100
200/200 [==============================] - 6s 29ms/step - loss: 7.2504 - PSNR: 31.7083 - val_loss: 7.6241 - val_PSNR: 26.4707
Epoch 47/100
200/200 [==============================] - 6s 28ms/step - loss: 7.2827 - PSNR: 31.6426 - val_loss: 6.9638 - val_PSNR: 29.5057
Epoch 48/100
200/200 [==============================] - 6s 29ms/step - loss: 7.2540 - PSNR: 32.3318 - val_loss: 7.7884 - val_PSNR: 28.3812
Epoch 49/100
200/200 [==============================] - 6s 30ms/step - loss: 7.3120 - PSNR: 31.6338 - val_loss: 7.4774 - val_PSNR: 35.8825
Epoch 50/100
200/200 [==============================] - 6s 28ms/step - loss: 7.4906 - PSNR: 31.3568 - val_loss: 7.0632 - val_PSNR: 34.8070
Epoch 51/100
200/200 [==============================] - 6s 30ms/step - loss: 7.2021 - PSNR: 31.7188 - val_loss: 7.1128 - val_PSNR: 30.2057
Epoch 52/100
200/200 [==============================] - 6s 29ms/step - loss: 7.3213 - PSNR: 32.1367 - val_loss: 7.3315 - val_PSNR: 32.8360
Epoch 53/100
200/200 [==============================] - 6s 28ms/step - loss: 7.2746 - PSNR: 32.0393 - val_loss: 7.1581 - val_PSNR: 33.2517
Epoch 54/100
200/200 [==============================] - 6s 30ms/step - loss: 7.1720 - PSNR: 33.0238 - val_loss: 7.8597 - val_PSNR: 27.4799
Epoch 55/100
200/200 [==============================] - 6s 28ms/step - loss: 7.1792 - PSNR: 32.4961 - val_loss: 8.1411 - val_PSNR: 28.9605
Epoch 56/100
200/200 [==============================] - 6s 29ms/step - loss: 7.2938 - PSNR: 31.1547 - val_loss: 7.3577 - val_PSNR: 27.4208
Epoch 57/100
200/200 [==============================] - 6s 29ms/step - loss: 7.1370 - PSNR: 33.1147 - val_loss: 6.1355 - val_PSNR: 28.8502
Epoch 58/100
200/200 [==============================] - 6s 28ms/step - loss: 7.2961 - PSNR: 32.1824 - val_loss: 7.8447 - val_PSNR: 32.2654
Epoch 59/100
200/200 [==============================] - 6s 29ms/step - loss: 7.1345 - PSNR: 31.7046 - val_loss: 7.7828 - val_PSNR: 29.9260
Epoch 60/100
200/200 [==============================] - 6s 29ms/step - loss: 7.0958 - PSNR: 31.5460 - val_loss: 7.5331 - val_PSNR: 36.5297
Epoch 61/100
200/200 [==============================] - 6s 28ms/step - loss: 7.3619 - PSNR: 31.4198 - val_loss: 7.5253 - val_PSNR: 28.8052
Epoch 62/100
200/200 [==============================] - 6s 28ms/step - loss: 7.0975 - PSNR: 33.0975 - val_loss: 7.3569 - val_PSNR: 33.5877
Epoch 63/100
200/200 [==============================] - 6s 29ms/step - loss: 7.3543 - PSNR: 32.5728 - val_loss: 6.8683 - val_PSNR: 33.4465
Epoch 64/100
200/200 [==============================] - 6s 29ms/step - loss: 7.1647 - PSNR: 32.0781 - val_loss: 6.7196 - val_PSNR: 34.9208
Epoch 65/100
200/200 [==============================] - 6s 29ms/step - loss: 7.1807 - PSNR: 31.0676 - val_loss: 6.8687 - val_PSNR: 27.6641
Epoch 66/100
200/200 [==============================] - 6s 28ms/step - loss: 7.1425 - PSNR: 32.0895 - val_loss: 8.0050 - val_PSNR: 31.0471
Epoch 67/100
200/200 [==============================] - 6s 28ms/step - loss: 7.1520 - PSNR: 31.8699 - val_loss: 7.5228 - val_PSNR: 26.2802
Epoch 68/100
200/200 [==============================] - 6s 30ms/step - loss: 7.0310 - PSNR: 31.8166 - val_loss: 7.1801 - val_PSNR: 37.7283
Epoch 69/100
200/200 [==============================] - 6s 28ms/step - loss: 7.0807 - PSNR: 32.3654 - val_loss: 6.8095 - val_PSNR: 31.2476
Epoch 70/100
200/200 [==============================] - 6s 30ms/step - loss: 7.0308 - PSNR: 31.8837 - val_loss: 6.8889 - val_PSNR: 32.5382
Epoch 71/100
200/200 [==============================] - 6s 28ms/step - loss: 7.1538 - PSNR: 31.9252 - val_loss: 6.9397 - val_PSNR: 30.7882
Epoch 72/100
200/200 [==============================] - 6s 28ms/step - loss: 7.2134 - PSNR: 32.1782 - val_loss: 7.0367 - val_PSNR: 34.2921
Epoch 73/100
200/200 [==============================] - 6s 28ms/step - loss: 7.0529 - PSNR: 32.5049 - val_loss: 7.3083 - val_PSNR: 35.5279
Epoch 74/100
200/200 [==============================] - 6s 30ms/step - loss: 7.2309 - PSNR: 32.3911 - val_loss: 6.9757 - val_PSNR: 34.1820
Epoch 75/100
200/200 [==============================] - 6s 29ms/step - loss: 7.1462 - PSNR: 31.3715 - val_loss: 6.2669 - val_PSNR: 31.1540
Epoch 76/100
200/200 [==============================] - 6s 30ms/step - loss: 7.0851 - PSNR: 31.1408 - val_loss: 7.7505 - val_PSNR: 34.5820
Epoch 77/100
200/200 [==============================] - 6s 29ms/step - loss: 7.2372 - PSNR: 32.0315 - val_loss: 6.3560 - val_PSNR: 30.2261
Epoch 78/100
200/200 [==============================] - 6s 28ms/step - loss: 7.0430 - PSNR: 32.5781 - val_loss: 6.4512 - val_PSNR: 29.2954
Epoch 79/100
200/200 [==============================] - 6s 28ms/step - loss: 6.9509 - PSNR: 32.2366 - val_loss: 7.4811 - val_PSNR: 32.7448
Epoch 80/100
200/200 [==============================] - 6s 30ms/step - loss: 6.9370 - PSNR: 32.6088 - val_loss: 7.4800 - val_PSNR: 29.4825
Epoch 81/100
200/200 [==============================] - 6s 30ms/step - loss: 7.0940 - PSNR: 32.0280 - val_loss: 7.3737 - val_PSNR: 33.6980
Epoch 82/100
200/200 [==============================] - 5s 27ms/step - loss: 7.0983 - PSNR: 32.2970 - val_loss: 6.7717 - val_PSNR: 33.4296
Epoch 83/100
200/200 [==============================] - 6s 28ms/step - loss: 7.1983 - PSNR: 31.8209 - val_loss: 6.7153 - val_PSNR: 29.9430
Epoch 84/100
200/200 [==============================] - 5s 27ms/step - loss: 7.0848 - PSNR: 32.0996 - val_loss: 6.8991 - val_PSNR: 33.3069
Epoch 85/100
200/200 [==============================] - 6s 29ms/step - loss: 7.0981 - PSNR: 32.6938 - val_loss: 6.6619 - val_PSNR: 29.4661
Epoch 86/100
200/200 [==============================] - 6s 31ms/step - loss: 7.0807 - PSNR: 31.6211 - val_loss: 7.2031 - val_PSNR: 33.4928
Epoch 87/100
200/200 [==============================] - 6s 29ms/step - loss: 7.0449 - PSNR: 31.7733 - val_loss: 7.0595 - val_PSNR: 25.0514
Epoch 88/100
200/200 [==============================] - 6s 28ms/step - loss: 7.0268 - PSNR: 32.4524 - val_loss: 6.6986 - val_PSNR: 35.0220
Epoch 89/100
200/200 [==============================] - 6s 28ms/step - loss: 7.1375 - PSNR: 32.4135 - val_loss: 7.2706 - val_PSNR: 32.5165
Epoch 90/100
200/200 [==============================] - 6s 28ms/step - loss: 6.9452 - PSNR: 33.3333 - val_loss: 6.4050 - val_PSNR: 28.8537
Epoch 91/100
200/200 [==============================] - 6s 31ms/step - loss: 6.8539 - PSNR: 32.1334 - val_loss: 7.8148 - val_PSNR: 34.5253
Epoch 92/100
200/200 [==============================] - 6s 30ms/step - loss: 7.0235 - PSNR: 32.6531 - val_loss: 6.7267 - val_PSNR: 29.8156
Epoch 93/100
200/200 [==============================] - 6s 29ms/step - loss: 7.0329 - PSNR: 32.5876 - val_loss: 6.6561 - val_PSNR: 37.9333
Epoch 94/100
200/200 [==============================] - 5s 27ms/step - loss: 6.8629 - PSNR: 32.7550 - val_loss: 6.9915 - val_PSNR: 32.4213
Epoch 95/100
200/200 [==============================] - 6s 29ms/step - loss: 6.9922 - PSNR: 32.7152 - val_loss: 7.0446 - val_PSNR: 28.9388
Epoch 96/100
200/200 [==============================] - 6s 27ms/step - loss: 6.8747 - PSNR: 32.1347 - val_loss: 6.5663 - val_PSNR: 29.1013
Epoch 97/100
200/200 [==============================] - 7s 33ms/step - loss: 6.9574 - PSNR: 32.1491 - val_loss: 7.0975 - val_PSNR: 36.2528
Epoch 98/100
200/200 [==============================] - 6s 29ms/step - loss: 6.8989 - PSNR: 32.1577 - val_loss: 6.5798 - val_PSNR: 29.4363
Epoch 99/100
200/200 [==============================] - 5s 27ms/step - loss: 6.8543 - PSNR: 32.1862 - val_loss: 6.6247 - val_PSNR: 29.2380
Epoch 100/100
200/200 [==============================] - 6s 29ms/step - loss: 6.8985 - PSNR: 33.1180 - val_loss: 6.4186 - val_PSNR: 32.1822
Training time: 590.00 seconds
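The PSNR values logged above follow the standard peak signal-to-noise ratio definition for 8-bit images. As a minimal sketch (assuming pixel values in [0, 255] and MSE computed over all channels — the exact reduction used by the training metric may differ):

```python
import numpy as np

def psnr(y_true, y_pred, max_val=255.0):
    """Peak signal-to-noise ratio in dB for images in [0, max_val]."""
    mse = np.mean((y_true.astype(np.float64) - y_pred.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images: no noise
    return 20.0 * np.log10(max_val) - 10.0 * np.log10(mse)

# Two 8-bit images whose pixels differ by a constant 16 -> MSE = 256
a = np.zeros((32, 32, 3), dtype=np.uint8)
b = np.full((32, 32, 3), 16, dtype=np.uint8)
print(round(psnr(a, b), 2))  # -> 24.05
```

Higher is better: each +6 dB roughly corresponds to halving the RMS reconstruction error, which is why the slow climb from ~29 to ~33 dB over 100 epochs reflects steadily shrinking pixel error.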
Train with Residual Blocks: 16 and Upsample method: UpSampling2D
Model: "edsr_model_4"
__________________________________________________________________________________________________
 Layer (type)                   Output Shape         Param #     Connected to                     
==================================================================================================
 input_5 (InputLayer)           [(None, None, None,  0           []                               
                                 3)]                                                              
                                                                                                  
 rescaling_8 (Rescaling)        (None, None, None,   0           ['input_5[0][0]']                
                                3)                                                                
                                                                                                  
 conv2d_58 (Conv2D)             (None, None, None,   1792        ['rescaling_8[0][0]']            
                                64)                                                               
                                                                                                  
 conv2d_59 (Conv2D)             (None, None, None,   36928       ['conv2d_58[0][0]']              
                                64)                                                               
                                                                                                  
 conv2d_60 (Conv2D)             (None, None, None,   36928       ['conv2d_59[0][0]']              
                                64)                                                               
                                                                                                  
 add_26 (Add)                   (None, None, None,   0           ['conv2d_58[0][0]',              
                                64)                               'conv2d_60[0][0]']              
                                                                                                  
 conv2d_61 (Conv2D)             (None, None, None,   36928       ['add_26[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_62 (Conv2D)             (None, None, None,   36928       ['conv2d_61[0][0]']              
                                64)                                                               
                                                                                                  
 add_27 (Add)                   (None, None, None,   0           ['add_26[0][0]',                 
                                64)                               'conv2d_62[0][0]']              
                                                                                                  
 conv2d_63 (Conv2D)             (None, None, None,   36928       ['add_27[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_64 (Conv2D)             (None, None, None,   36928       ['conv2d_63[0][0]']              
                                64)                                                               
                                                                                                  
 add_28 (Add)                   (None, None, None,   0           ['add_27[0][0]',                 
                                64)                               'conv2d_64[0][0]']              
                                                                                                  
 conv2d_65 (Conv2D)             (None, None, None,   36928       ['add_28[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_66 (Conv2D)             (None, None, None,   36928       ['conv2d_65[0][0]']              
                                64)                                                               
                                                                                                  
 add_29 (Add)                   (None, None, None,   0           ['add_28[0][0]',                 
                                64)                               'conv2d_66[0][0]']              
                                                                                                  
 conv2d_67 (Conv2D)             (None, None, None,   36928       ['add_29[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_68 (Conv2D)             (None, None, None,   36928       ['conv2d_67[0][0]']              
                                64)                                                               
                                                                                                  
 add_30 (Add)                   (None, None, None,   0           ['add_29[0][0]',                 
                                64)                               'conv2d_68[0][0]']              
                                                                                                  
 conv2d_69 (Conv2D)             (None, None, None,   36928       ['add_30[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_70 (Conv2D)             (None, None, None,   36928       ['conv2d_69[0][0]']              
                                64)                                                               
                                                                                                  
 add_31 (Add)                   (None, None, None,   0           ['add_30[0][0]',                 
                                64)                               'conv2d_70[0][0]']              
                                                                                                  
 conv2d_71 (Conv2D)             (None, None, None,   36928       ['add_31[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_72 (Conv2D)             (None, None, None,   36928       ['conv2d_71[0][0]']              
                                64)                                                               
                                                                                                  
 add_32 (Add)                   (None, None, None,   0           ['add_31[0][0]',                 
                                64)                               'conv2d_72[0][0]']              
                                                                                                  
 conv2d_73 (Conv2D)             (None, None, None,   36928       ['add_32[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_74 (Conv2D)             (None, None, None,   36928       ['conv2d_73[0][0]']              
                                64)                                                               
                                                                                                  
 add_33 (Add)                   (None, None, None,   0           ['add_32[0][0]',                 
                                64)                               'conv2d_74[0][0]']              
                                                                                                  
 conv2d_75 (Conv2D)             (None, None, None,   36928       ['add_33[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_76 (Conv2D)             (None, None, None,   36928       ['conv2d_75[0][0]']              
                                64)                                                               
                                                                                                  
 add_34 (Add)                   (None, None, None,   0           ['add_33[0][0]',                 
                                64)                               'conv2d_76[0][0]']              
                                                                                                  
 conv2d_77 (Conv2D)             (None, None, None,   36928       ['add_34[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_78 (Conv2D)             (None, None, None,   36928       ['conv2d_77[0][0]']              
                                64)                                                               
                                                                                                  
 add_35 (Add)                   (None, None, None,   0           ['add_34[0][0]',                 
                                64)                               'conv2d_78[0][0]']              
                                                                                                  
 conv2d_79 (Conv2D)             (None, None, None,   36928       ['add_35[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_80 (Conv2D)             (None, None, None,   36928       ['conv2d_79[0][0]']              
                                64)                                                               
                                                                                                  
 add_36 (Add)                   (None, None, None,   0           ['add_35[0][0]',                 
                                64)                               'conv2d_80[0][0]']              
                                                                                                  
 conv2d_81 (Conv2D)             (None, None, None,   36928       ['add_36[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_82 (Conv2D)             (None, None, None,   36928       ['conv2d_81[0][0]']              
                                64)                                                               
                                                                                                  
 add_37 (Add)                   (None, None, None,   0           ['add_36[0][0]',                 
                                64)                               'conv2d_82[0][0]']              
                                                                                                  
 conv2d_83 (Conv2D)             (None, None, None,   36928       ['add_37[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_84 (Conv2D)             (None, None, None,   36928       ['conv2d_83[0][0]']              
                                64)                                                               
                                                                                                  
 add_38 (Add)                   (None, None, None,   0           ['add_37[0][0]',                 
                                64)                               'conv2d_84[0][0]']              
                                                                                                  
 conv2d_85 (Conv2D)             (None, None, None,   36928       ['add_38[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_86 (Conv2D)             (None, None, None,   36928       ['conv2d_85[0][0]']              
                                64)                                                               
                                                                                                  
 add_39 (Add)                   (None, None, None,   0           ['add_38[0][0]',                 
                                64)                               'conv2d_86[0][0]']              
                                                                                                  
 conv2d_87 (Conv2D)             (None, None, None,   36928       ['add_39[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_88 (Conv2D)             (None, None, None,   36928       ['conv2d_87[0][0]']              
                                64)                                                               
                                                                                                  
 add_40 (Add)                   (None, None, None,   0           ['add_39[0][0]',                 
                                64)                               'conv2d_88[0][0]']              
                                                                                                  
 conv2d_89 (Conv2D)             (None, None, None,   36928       ['add_40[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_90 (Conv2D)             (None, None, None,   36928       ['conv2d_89[0][0]']              
                                64)                                                               
                                                                                                  
 add_41 (Add)                   (None, None, None,   0           ['add_40[0][0]',                 
                                64)                               'conv2d_90[0][0]']              
                                                                                                  
 conv2d_91 (Conv2D)             (None, None, None,   36928       ['add_41[0][0]']                 
                                64)                                                               
                                                                                                  
 add_42 (Add)                   (None, None, None,   0           ['conv2d_58[0][0]',              
                                64)                               'conv2d_91[0][0]']              
                                                                                                  
 up_sampling2d_2 (UpSampling2D)  (None, None, None,   0          ['add_42[0][0]']                 
                                64)                                                               
                                                                                                  
 up_sampling2d_3 (UpSampling2D)  (None, None, None,   0          ['up_sampling2d_2[0][0]']        
                                64)                                                               
                                                                                                  
 conv2d_92 (Conv2D)             (None, None, None,   1731        ['up_sampling2d_3[0][0]']        
                                3)                                                                
                                                                                                  
 rescaling_9 (Rescaling)        (None, None, None,   0           ['conv2d_92[0][0]']              
                                3)                                                                
                                                                                                  
==================================================================================================
Total params: 1,222,147
Trainable params: 1,222,147
Non-trainable params: 0
__________________________________________________________________________________________________
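The per-layer parameter counts in the summary can be verified by hand with the standard Conv2D formula (kernel_h x kernel_w x channels_in x channels_out, plus one bias per output channel); the 3x3 kernel size is recoverable from the counts themselves, since 36,928 = 3*3*64*64 + 64. A quick sketch reproducing the summary's total of 1,222,147 for this 16-residual-block model:

```python
def conv2d_params(k, c_in, c_out):
    """Keras Conv2D parameter count: k*k weights per in/out channel pair, plus biases."""
    return k * k * c_in * c_out + c_out

n_res_blocks = 16
head = conv2d_params(3, 3, 64)                       # first conv, RGB -> 64: 1,792
body = 2 * n_res_blocks * conv2d_params(3, 64, 64)   # two 36,928-param convs per residual block
skip = conv2d_params(3, 64, 64)                      # conv feeding the global skip (add_42)
tail = conv2d_params(3, 64, 3)                       # final conv back to RGB: 1,731
# UpSampling2D layers are parameter-free interpolation, so they contribute nothing --
# unlike Conv2DTranspose or sub-pixel convolution, which carry learned weights.
total = head + body + skip + tail
print(total)  # -> 1222147
```

This also makes the trade-off in the experiment concrete: switching the upsampling block to UpSampling2D changes the parameter count only through any accompanying convolutions, while the residual-block count dominates the model size (32 of the 34 large convolutions live in the residual body).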
Epoch 1/100
200/200 [==============================] - 17s 31ms/step - loss: 51.1347 - PSNR: 15.7638 - val_loss: 21.1856 - val_PSNR: 20.6225
Epoch 2/100
200/200 [==============================] - 5s 24ms/step - loss: 16.9510 - PSNR: 22.1069 - val_loss: 14.8004 - val_PSNR: 23.6160
Epoch 3/100
200/200 [==============================] - 5s 24ms/step - loss: 13.7195 - PSNR: 24.7569 - val_loss: 12.3262 - val_PSNR: 23.5361
Epoch 4/100
200/200 [==============================] - 5s 24ms/step - loss: 12.0817 - PSNR: 25.5904 - val_loss: 10.9015 - val_PSNR: 26.7984
Epoch 5/100
200/200 [==============================] - 5s 26ms/step - loss: 11.2346 - PSNR: 26.6826 - val_loss: 11.1898 - val_PSNR: 28.1774
Epoch 6/100
200/200 [==============================] - 5s 25ms/step - loss: 10.5614 - PSNR: 26.7754 - val_loss: 10.7009 - val_PSNR: 27.1563
Epoch 7/100
200/200 [==============================] - 5s 27ms/step - loss: 10.3039 - PSNR: 27.4743 - val_loss: 9.1951 - val_PSNR: 25.2586
Epoch 8/100
200/200 [==============================] - 5s 25ms/step - loss: 9.7396 - PSNR: 27.8913 - val_loss: 9.0293 - val_PSNR: 27.6261
Epoch 9/100
200/200 [==============================] - 5s 23ms/step - loss: 9.6408 - PSNR: 28.3736 - val_loss: 9.2961 - val_PSNR: 28.0373
Epoch 10/100
200/200 [==============================] - 5s 24ms/step - loss: 9.2471 - PSNR: 28.3628 - val_loss: 9.3442 - val_PSNR: 26.8069
Epoch 11/100
200/200 [==============================] - 5s 23ms/step - loss: 9.3253 - PSNR: 28.4969 - val_loss: 9.6147 - val_PSNR: 26.9690
Epoch 12/100
200/200 [==============================] - 5s 27ms/step - loss: 9.2662 - PSNR: 28.6225 - val_loss: 8.8186 - val_PSNR: 28.8758
Epoch 13/100
200/200 [==============================] - 5s 24ms/step - loss: 8.8713 - PSNR: 29.4019 - val_loss: 9.3781 - val_PSNR: 30.6475
Epoch 14/100
200/200 [==============================] - 6s 28ms/step - loss: 8.8405 - PSNR: 28.5618 - val_loss: 9.0801 - val_PSNR: 30.6040
Epoch 15/100
200/200 [==============================] - 5s 23ms/step - loss: 8.7410 - PSNR: 29.4704 - val_loss: 8.4210 - val_PSNR: 27.8920
Epoch 16/100
200/200 [==============================] - 5s 25ms/step - loss: 8.7558 - PSNR: 29.3513 - val_loss: 8.9297 - val_PSNR: 30.8252
Epoch 17/100
200/200 [==============================] - 5s 23ms/step - loss: 8.5389 - PSNR: 29.7029 - val_loss: 8.0594 - val_PSNR: 25.2917
Epoch 18/100
200/200 [==============================] - 5s 25ms/step - loss: 8.5658 - PSNR: 30.1905 - val_loss: 9.2223 - val_PSNR: 28.1246
Epoch 19/100
200/200 [==============================] - 5s 24ms/step - loss: 8.3085 - PSNR: 29.4969 - val_loss: 8.9184 - val_PSNR: 30.2876
Epoch 20/100
200/200 [==============================] - 5s 23ms/step - loss: 8.4759 - PSNR: 29.7793 - val_loss: 8.8368 - val_PSNR: 26.6592
Epoch 21/100
200/200 [==============================] - 6s 29ms/step - loss: 8.3387 - PSNR: 29.9120 - val_loss: 8.2056 - val_PSNR: 29.5622
Epoch 22/100
200/200 [==============================] - 5s 23ms/step - loss: 8.3193 - PSNR: 30.1434 - val_loss: 7.8769 - val_PSNR: 30.2302
Epoch 23/100
200/200 [==============================] - 5s 25ms/step - loss: 8.3325 - PSNR: 30.2311 - val_loss: 8.5879 - val_PSNR: 35.9409
Epoch 24/100
200/200 [==============================] - 5s 25ms/step - loss: 8.3677 - PSNR: 29.5960 - val_loss: 8.7572 - val_PSNR: 24.9488
Epoch 25/100
200/200 [==============================] - 5s 24ms/step - loss: 8.1480 - PSNR: 30.2716 - val_loss: 7.6321 - val_PSNR: 25.0161
Epoch 26/100
200/200 [==============================] - 5s 23ms/step - loss: 8.0077 - PSNR: 30.5484 - val_loss: 7.9876 - val_PSNR: 32.6433
Epoch 27/100
200/200 [==============================] - 5s 25ms/step - loss: 8.0438 - PSNR: 31.7881 - val_loss: 8.2335 - val_PSNR: 27.0430
Epoch 28/100
200/200 [==============================] - 5s 27ms/step - loss: 7.8027 - PSNR: 31.0810 - val_loss: 7.5502 - val_PSNR: 33.6384
Epoch 29/100
200/200 [==============================] - 5s 23ms/step - loss: 8.0903 - PSNR: 30.5825 - val_loss: 7.9668 - val_PSNR: 27.5098
Epoch 30/100
200/200 [==============================] - 5s 24ms/step - loss: 8.0406 - PSNR: 31.1930 - val_loss: 7.7136 - val_PSNR: 30.2787
Epoch 31/100
200/200 [==============================] - 5s 25ms/step - loss: 7.8950 - PSNR: 31.4787 - val_loss: 7.9930 - val_PSNR: 31.5450
Epoch 32/100
200/200 [==============================] - 5s 24ms/step - loss: 8.0342 - PSNR: 30.4791 - val_loss: 7.9564 - val_PSNR: 34.2960
Epoch 33/100
200/200 [==============================] - 5s 23ms/step - loss: 7.9482 - PSNR: 30.8209 - val_loss: 8.4953 - val_PSNR: 28.0086
Epoch 34/100
200/200 [==============================] - 5s 27ms/step - loss: 7.8354 - PSNR: 30.7795 - val_loss: 7.3558 - val_PSNR: 29.4524
Epoch 35/100
200/200 [==============================] - 5s 25ms/step - loss: 7.7739 - PSNR: 31.1586 - val_loss: 7.4183 - val_PSNR: 33.7560
Epoch 36/100
200/200 [==============================] - 5s 24ms/step - loss: 8.0502 - PSNR: 31.0083 - val_loss: 8.4974 - val_PSNR: 29.2749
Epoch 37/100
200/200 [==============================] - 5s 25ms/step - loss: 8.0206 - PSNR: 30.3373 - val_loss: 7.4339 - val_PSNR: 31.5273
Epoch 38/100
200/200 [==============================] - 5s 23ms/step - loss: 7.8774 - PSNR: 31.2036 - val_loss: 7.4696 - val_PSNR: 30.2171
Epoch 39/100
200/200 [==============================] - 5s 24ms/step - loss: 7.8726 - PSNR: 31.2640 - val_loss: 7.9460 - val_PSNR: 35.0690
Epoch 40/100
200/200 [==============================] - 5s 23ms/step - loss: 7.8262 - PSNR: 30.4447 - val_loss: 7.5625 - val_PSNR: 28.2478
Epoch 41/100
200/200 [==============================] - 5s 27ms/step - loss: 7.8190 - PSNR: 31.8279 - val_loss: 7.9609 - val_PSNR: 28.3604
Epoch 42/100
200/200 [==============================] - 5s 24ms/step - loss: 7.9076 - PSNR: 30.9561 - val_loss: 7.6669 - val_PSNR: 28.5978
Epoch 43/100
200/200 [==============================] - 5s 25ms/step - loss: 7.6469 - PSNR: 29.6370 - val_loss: 8.2193 - val_PSNR: 31.6554
Epoch 44/100
200/200 [==============================] - 5s 24ms/step - loss: 7.7019 - PSNR: 30.7646 - val_loss: 7.7329 - val_PSNR: 31.9847
Epoch 45/100
200/200 [==============================] - 5s 24ms/step - loss: 7.7013 - PSNR: 30.7467 - val_loss: 7.8213 - val_PSNR: 34.6631
Epoch 46/100
200/200 [==============================] - 5s 23ms/step - loss: 7.6997 - PSNR: 31.0849 - val_loss: 8.4952 - val_PSNR: 30.4933
Epoch 47/100
200/200 [==============================] - 5s 23ms/step - loss: 7.7202 - PSNR: 30.5264 - val_loss: 7.8633 - val_PSNR: 27.2873
Epoch 48/100
200/200 [==============================] - 5s 27ms/step - loss: 7.6953 - PSNR: 30.8625 - val_loss: 7.6338 - val_PSNR: 28.6837
Epoch 49/100
200/200 [==============================] - 5s 24ms/step - loss: 7.7594 - PSNR: 31.1755 - val_loss: 7.1813 - val_PSNR: 26.9279
Epoch 50/100
200/200 [==============================] - 5s 27ms/step - loss: 7.6261 - PSNR: 31.1761 - val_loss: 7.7876 - val_PSNR: 27.3341
Epoch 51/100
200/200 [==============================] - 5s 23ms/step - loss: 7.7745 - PSNR: 30.8263 - val_loss: 7.5156 - val_PSNR: 32.3333
Epoch 52/100
200/200 [==============================] - 5s 24ms/step - loss: 7.6008 - PSNR: 31.7587 - val_loss: 8.0623 - val_PSNR: 32.0190
Epoch 53/100
200/200 [==============================] - 5s 23ms/step - loss: 7.6266 - PSNR: 31.1185 - val_loss: 8.0691 - val_PSNR: 30.8687
Epoch 54/100
200/200 [==============================] - 5s 24ms/step - loss: 7.5881 - PSNR: 31.1294 - val_loss: 8.8379 - val_PSNR: 27.1279
Epoch 55/100
200/200 [==============================] - 5s 27ms/step - loss: 7.7102 - PSNR: 31.0702 - val_loss: 8.3599 - val_PSNR: 31.4379
Epoch 56/100
200/200 [==============================] - 5s 25ms/step - loss: 7.7030 - PSNR: 31.2776 - val_loss: 7.0821 - val_PSNR: 28.9307
Epoch 57/100
200/200 [==============================] - 5s 24ms/step - loss: 7.6365 - PSNR: 30.9228 - val_loss: 7.5395 - val_PSNR: 25.8052
Epoch 58/100
200/200 [==============================] - 5s 23ms/step - loss: 7.7216 - PSNR: 30.8950 - val_loss: 7.2971 - val_PSNR: 29.5786
Epoch 59/100
200/200 [==============================] - 5s 24ms/step - loss: 7.7276 - PSNR: 31.2180 - val_loss: 6.8781 - val_PSNR: 30.0098
Epoch 60/100
200/200 [==============================] - 5s 23ms/step - loss: 7.5367 - PSNR: 32.2129 - val_loss: 7.2518 - val_PSNR: 28.1806
Epoch 61/100
200/200 [==============================] - 5s 24ms/step - loss: 7.6314 - PSNR: 31.2292 - val_loss: 7.8419 - val_PSNR: 27.4896
Epoch 62/100
200/200 [==============================] - 5s 27ms/step - loss: 7.6395 - PSNR: 30.7902 - val_loss: 7.0972 - val_PSNR: 32.7862
Epoch 63/100
200/200 [==============================] - 5s 26ms/step - loss: 7.6798 - PSNR: 30.9359 - val_loss: 7.6548 - val_PSNR: 28.8376
Epoch 64/100
200/200 [==============================] - 5s 23ms/step - loss: 7.5944 - PSNR: 32.1995 - val_loss: 7.8510 - val_PSNR: 27.6524
Epoch 65/100
200/200 [==============================] - 5s 23ms/step - loss: 7.6976 - PSNR: 31.3177 - val_loss: 7.3681 - val_PSNR: 26.9632
Epoch 66/100
200/200 [==============================] - 5s 24ms/step - loss: 7.6104 - PSNR: 31.1558 - val_loss: 7.0828 - val_PSNR: 31.2856
Epoch 67/100
200/200 [==============================] - 5s 23ms/step - loss: 7.5658 - PSNR: 31.2788 - val_loss: 7.1656 - val_PSNR: 31.1375
Epoch 68/100
200/200 [==============================] - 5s 24ms/step - loss: 7.5012 - PSNR: 32.1413 - val_loss: 7.3381 - val_PSNR: 31.7715
Epoch 69/100
200/200 [==============================] - 6s 30ms/step - loss: 7.4850 - PSNR: 31.1948 - val_loss: 7.4143 - val_PSNR: 32.4784
Epoch 70/100
200/200 [==============================] - 5s 24ms/step - loss: 7.5641 - PSNR: 32.0359 - val_loss: 7.3835 - val_PSNR: 29.9480
Epoch 71/100
200/200 [==============================] - 5s 23ms/step - loss: 7.5340 - PSNR: 30.9331 - val_loss: 8.1031 - val_PSNR: 29.4671
Epoch 72/100
200/200 [==============================] - 5s 24ms/step - loss: 7.4417 - PSNR: 31.3967 - val_loss: 7.8881 - val_PSNR: 30.7113
Epoch 73/100
200/200 [==============================] - 5s 23ms/step - loss: 7.5486 - PSNR: 31.2077 - val_loss: 7.6450 - val_PSNR: 27.9421
Epoch 74/100
200/200 [==============================] - 5s 23ms/step - loss: 7.3809 - PSNR: 32.1505 - val_loss: 7.2975 - val_PSNR: 33.1598
Epoch 75/100
200/200 [==============================] - 5s 25ms/step - loss: 7.4748 - PSNR: 31.6929 - val_loss: 7.0933 - val_PSNR: 31.7691
Epoch 76/100
200/200 [==============================] - 6s 29ms/step - loss: 7.5850 - PSNR: 31.1756 - val_loss: 7.2140 - val_PSNR: 31.3329
Epoch 77/100
200/200 [==============================] - 5s 24ms/step - loss: 7.5636 - PSNR: 32.0245 - val_loss: 7.4734 - val_PSNR: 31.0657
Epoch 78/100
200/200 [==============================] - 5s 23ms/step - loss: 7.5276 - PSNR: 31.7250 - val_loss: 7.5573 - val_PSNR: 33.5534
Epoch 79/100
200/200 [==============================] - 5s 24ms/step - loss: 7.3947 - PSNR: 30.9446 - val_loss: 7.9267 - val_PSNR: 26.7340
Epoch 80/100
200/200 [==============================] - 5s 23ms/step - loss: 7.4063 - PSNR: 31.5861 - val_loss: 7.4863 - val_PSNR: 31.6447
Epoch 81/100
200/200 [==============================] - 5s 24ms/step - loss: 7.3424 - PSNR: 32.2676 - val_loss: 7.9719 - val_PSNR: 31.9860
Epoch 82/100
200/200 [==============================] - 5s 25ms/step - loss: 7.4569 - PSNR: 31.3147 - val_loss: 7.6705 - val_PSNR: 33.2949
Epoch 83/100
200/200 [==============================] - 6s 28ms/step - loss: 7.5565 - PSNR: 31.1333 - val_loss: 7.4213 - val_PSNR: 33.0158
Epoch 84/100
200/200 [==============================] - 5s 23ms/step - loss: 7.4643 - PSNR: 31.9889 - val_loss: 7.9844 - val_PSNR: 30.7147
Epoch 85/100
200/200 [==============================] - 5s 23ms/step - loss: 7.4063 - PSNR: 31.1166 - val_loss: 7.9987 - val_PSNR: 29.5907
Epoch 86/100
200/200 [==============================] - 5s 24ms/step - loss: 7.3741 - PSNR: 32.2990 - val_loss: 7.6735 - val_PSNR: 30.5529
Epoch 87/100
200/200 [==============================] - 5s 23ms/step - loss: 7.5110 - PSNR: 31.4022 - val_loss: 7.7850 - val_PSNR: 26.9728
Epoch 88/100
200/200 [==============================] - 5s 26ms/step - loss: 7.4729 - PSNR: 32.0911 - val_loss: 7.2887 - val_PSNR: 32.5554
Epoch 89/100
200/200 [==============================] - 5s 24ms/step - loss: 7.3537 - PSNR: 30.9344 - val_loss: 6.1837 - val_PSNR: 30.5251
Epoch 90/100
200/200 [==============================] - 6s 28ms/step - loss: 7.3530 - PSNR: 32.2839 - val_loss: 8.3047 - val_PSNR: 30.0788
Epoch 91/100
200/200 [==============================] - 5s 23ms/step - loss: 7.5528 - PSNR: 31.3978 - val_loss: 7.5201 - val_PSNR: 30.6518
Epoch 92/100
200/200 [==============================] - 5s 24ms/step - loss: 7.2795 - PSNR: 32.0786 - val_loss: 7.6043 - val_PSNR: 28.5367
Epoch 93/100
200/200 [==============================] - 5s 23ms/step - loss: 7.5697 - PSNR: 31.1380 - val_loss: 7.8448 - val_PSNR: 31.1890
Epoch 94/100
200/200 [==============================] - 5s 23ms/step - loss: 7.4769 - PSNR: 31.0369 - val_loss: 7.5932 - val_PSNR: 33.5564
Epoch 95/100
200/200 [==============================] - 5s 27ms/step - loss: 7.3321 - PSNR: 32.3263 - val_loss: 7.8854 - val_PSNR: 29.4621
Epoch 96/100
200/200 [==============================] - 5s 25ms/step - loss: 7.3122 - PSNR: 30.4803 - val_loss: 8.2456 - val_PSNR: 34.3260
Epoch 97/100
200/200 [==============================] - 6s 28ms/step - loss: 7.3982 - PSNR: 31.3874 - val_loss: 7.9937 - val_PSNR: 29.0465
Epoch 98/100
200/200 [==============================] - 5s 23ms/step - loss: 7.2731 - PSNR: 32.4385 - val_loss: 7.7090 - val_PSNR: 33.3788
Epoch 99/100
200/200 [==============================] - 5s 24ms/step - loss: 7.2461 - PSNR: 31.0295 - val_loss: 7.4717 - val_PSNR: 30.8736
Epoch 100/100
200/200 [==============================] - 5s 23ms/step - loss: 7.2764 - PSNR: 32.3360 - val_loss: 7.2045 - val_PSNR: 32.4493
Training time: 507.0695264339447
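The PSNR metric reported in the logs above is derived from the mean squared error between the super-resolved output and the ground-truth image. A minimal NumPy sketch of the computation (assuming 8-bit images with a peak value of 255; the notebook itself would use a TensorFlow metric such as `tf.image.psnr`):

```python
import numpy as np

def psnr(y_true, y_pred, max_val=255.0):
    """Peak signal-to-noise ratio in dB for images in [0, max_val]."""
    mse = np.mean((y_true.astype(np.float64) - y_pred.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 20.0 * np.log10(max_val) - 10.0 * np.log10(mse)

# Toy example: a constant offset of 1 on 8-bit images gives MSE = 1,
# so PSNR = 20 * log10(255) ≈ 48.13 dB.
a = np.full((8, 8, 3), 100, dtype=np.uint8)
b = np.full((8, 8, 3), 101, dtype=np.uint8)
print(round(psnr(a, b), 2))  # → 48.13
```

Higher PSNR means the reconstruction is closer to the ground truth, which is why the training runs above track it alongside the loss.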
Train with Residual Blocks: 16 and Upsample method: SubPixelConv
Model: "edsr_model_5"
__________________________________________________________________________________________________
 Layer (type)                   Output Shape         Param #     Connected to                     
==================================================================================================
 input_6 (InputLayer)           [(None, None, None,  0           []                               
                                 3)]                                                              
                                                                                                  
 rescaling_10 (Rescaling)       (None, None, None,   0           ['input_6[0][0]']                
                                3)                                                                
                                                                                                  
 conv2d_93 (Conv2D)             (None, None, None,   1792        ['rescaling_10[0][0]']           
                                64)                                                               
                                                                                                  
 conv2d_94 (Conv2D)             (None, None, None,   36928       ['conv2d_93[0][0]']              
                                64)                                                               
                                                                                                  
 conv2d_95 (Conv2D)             (None, None, None,   36928       ['conv2d_94[0][0]']              
                                64)                                                               
                                                                                                  
 add_43 (Add)                   (None, None, None,   0           ['conv2d_93[0][0]',              
                                64)                               'conv2d_95[0][0]']              
                                                                                                  
 conv2d_96 (Conv2D)             (None, None, None,   36928       ['add_43[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_97 (Conv2D)             (None, None, None,   36928       ['conv2d_96[0][0]']              
                                64)                                                               
                                                                                                  
 add_44 (Add)                   (None, None, None,   0           ['add_43[0][0]',                 
                                64)                               'conv2d_97[0][0]']              
                                                                                                  
 conv2d_98 (Conv2D)             (None, None, None,   36928       ['add_44[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_99 (Conv2D)             (None, None, None,   36928       ['conv2d_98[0][0]']              
                                64)                                                               
                                                                                                  
 add_45 (Add)                   (None, None, None,   0           ['add_44[0][0]',                 
                                64)                               'conv2d_99[0][0]']              
                                                                                                  
 conv2d_100 (Conv2D)            (None, None, None,   36928       ['add_45[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_101 (Conv2D)            (None, None, None,   36928       ['conv2d_100[0][0]']             
                                64)                                                               
                                                                                                  
 add_46 (Add)                   (None, None, None,   0           ['add_45[0][0]',                 
                                64)                               'conv2d_101[0][0]']             
                                                                                                  
 conv2d_102 (Conv2D)            (None, None, None,   36928       ['add_46[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_103 (Conv2D)            (None, None, None,   36928       ['conv2d_102[0][0]']             
                                64)                                                               
                                                                                                  
 add_47 (Add)                   (None, None, None,   0           ['add_46[0][0]',                 
                                64)                               'conv2d_103[0][0]']             
                                                                                                  
 conv2d_104 (Conv2D)            (None, None, None,   36928       ['add_47[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_105 (Conv2D)            (None, None, None,   36928       ['conv2d_104[0][0]']             
                                64)                                                               
                                                                                                  
 add_48 (Add)                   (None, None, None,   0           ['add_47[0][0]',                 
                                64)                               'conv2d_105[0][0]']             
                                                                                                  
 conv2d_106 (Conv2D)            (None, None, None,   36928       ['add_48[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_107 (Conv2D)            (None, None, None,   36928       ['conv2d_106[0][0]']             
                                64)                                                               
                                                                                                  
 add_49 (Add)                   (None, None, None,   0           ['add_48[0][0]',                 
                                64)                               'conv2d_107[0][0]']             
                                                                                                  
 conv2d_108 (Conv2D)            (None, None, None,   36928       ['add_49[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_109 (Conv2D)            (None, None, None,   36928       ['conv2d_108[0][0]']             
                                64)                                                               
                                                                                                  
 add_50 (Add)                   (None, None, None,   0           ['add_49[0][0]',                 
                                64)                               'conv2d_109[0][0]']             
                                                                                                  
 conv2d_110 (Conv2D)            (None, None, None,   36928       ['add_50[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_111 (Conv2D)            (None, None, None,   36928       ['conv2d_110[0][0]']             
                                64)                                                               
                                                                                                  
 add_51 (Add)                   (None, None, None,   0           ['add_50[0][0]',                 
                                64)                               'conv2d_111[0][0]']             
                                                                                                  
 conv2d_112 (Conv2D)            (None, None, None,   36928       ['add_51[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_113 (Conv2D)            (None, None, None,   36928       ['conv2d_112[0][0]']             
                                64)                                                               
                                                                                                  
 add_52 (Add)                   (None, None, None,   0           ['add_51[0][0]',                 
                                64)                               'conv2d_113[0][0]']             
                                                                                                  
 conv2d_114 (Conv2D)            (None, None, None,   36928       ['add_52[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_115 (Conv2D)            (None, None, None,   36928       ['conv2d_114[0][0]']             
                                64)                                                               
                                                                                                  
 add_53 (Add)                   (None, None, None,   0           ['add_52[0][0]',                 
                                64)                               'conv2d_115[0][0]']             
                                                                                                  
 conv2d_116 (Conv2D)            (None, None, None,   36928       ['add_53[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_117 (Conv2D)            (None, None, None,   36928       ['conv2d_116[0][0]']             
                                64)                                                               
                                                                                                  
 add_54 (Add)                   (None, None, None,   0           ['add_53[0][0]',                 
                                64)                               'conv2d_117[0][0]']             
                                                                                                  
 conv2d_118 (Conv2D)            (None, None, None,   36928       ['add_54[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_119 (Conv2D)            (None, None, None,   36928       ['conv2d_118[0][0]']             
                                64)                                                               
                                                                                                  
 add_55 (Add)                   (None, None, None,   0           ['add_54[0][0]',                 
                                64)                               'conv2d_119[0][0]']             
                                                                                                  
 conv2d_120 (Conv2D)            (None, None, None,   36928       ['add_55[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_121 (Conv2D)            (None, None, None,   36928       ['conv2d_120[0][0]']             
                                64)                                                               
                                                                                                  
 add_56 (Add)                   (None, None, None,   0           ['add_55[0][0]',                 
                                64)                               'conv2d_121[0][0]']             
                                                                                                  
 conv2d_122 (Conv2D)            (None, None, None,   36928       ['add_56[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_123 (Conv2D)            (None, None, None,   36928       ['conv2d_122[0][0]']             
                                64)                                                               
                                                                                                  
 add_57 (Add)                   (None, None, None,   0           ['add_56[0][0]',                 
                                64)                               'conv2d_123[0][0]']             
                                                                                                  
 conv2d_124 (Conv2D)            (None, None, None,   36928       ['add_57[0][0]']                 
                                64)                                                               
                                                                                                  
 conv2d_125 (Conv2D)            (None, None, None,   36928       ['conv2d_124[0][0]']             
                                64)                                                               
                                                                                                  
 add_58 (Add)                   (None, None, None,   0           ['add_57[0][0]',                 
                                64)                               'conv2d_125[0][0]']             
                                                                                                  
 conv2d_126 (Conv2D)            (None, None, None,   36928       ['add_58[0][0]']                 
                                64)                                                               
                                                                                                  
 add_59 (Add)                   (None, None, None,   0           ['conv2d_93[0][0]',              
                                64)                               'conv2d_126[0][0]']             
                                                                                                  
 conv2d_127 (Conv2D)            (None, None, None,   147712      ['add_59[0][0]']                 
                                256)                                                              
                                                                                                  
 tf.nn.depth_to_space_2 (TFOpLa  (None, None, None,   0          ['conv2d_127[0][0]']             
 mbda)                          64)                                                               
                                                                                                  
 conv2d_128 (Conv2D)            (None, None, None,   147712      ['tf.nn.depth_to_space_2[0][0]'] 
                                256)                                                              
                                                                                                  
 tf.nn.depth_to_space_3 (TFOpLa  (None, None, None,   0          ['conv2d_128[0][0]']             
 mbda)                          64)                                                               
                                                                                                  
 conv2d_129 (Conv2D)            (None, None, None,   1731        ['tf.nn.depth_to_space_3[0][0]'] 
                                3)                                                                
                                                                                                  
 rescaling_11 (Rescaling)       (None, None, None,   0           ['conv2d_129[0][0]']             
                                3)                                                                
                                                                                                  
==================================================================================================
Total params: 1,517,571
Trainable params: 1,517,571
Non-trainable params: 0
__________________________________________________________________________________________________
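The two `tf.nn.depth_to_space` layers in the summary above implement the sub-pixel convolution (pixel shuffle) upsampler: each preceding `Conv2D` expands 64 channels to 256, and `depth_to_space` with block size 2 rearranges those channels into a 2× larger spatial grid with 64 channels again, so the two stages together give the 4× upscale. A NumPy sketch of the channel-to-space rearrangement, matching the NHWC semantics of `tf.nn.depth_to_space` (illustrative only; the model uses TensorFlow's built-in op):

```python
import numpy as np

def depth_to_space(x, block_size):
    """Rearrange (H, W, C*r*r) -> (H*r, W*r, C), as tf.nn.depth_to_space does for NHWC."""
    h, w, c = x.shape
    r = block_size
    assert c % (r * r) == 0, "channel count must be divisible by block_size**2"
    out_c = c // (r * r)
    # Split channels into (r, r, out_c) sub-pixel groups, then interleave
    # the r*r sub-pixels into the spatial grid.
    x = x.reshape(h, w, r, r, out_c)
    x = x.transpose(0, 2, 1, 3, 4)          # (H, r, W, r, out_c)
    return x.reshape(h * r, w * r, out_c)

# 256 channels -> 2x spatial upscale with 64 channels, as in the summary.
x = np.arange(2 * 2 * 256, dtype=np.float32).reshape(2, 2, 256)
y = depth_to_space(x, 2)
print(y.shape)  # → (4, 4, 64)
```

Because the upscaling is a pure reindexing, this block adds no parameters, which is why the `tf.nn.depth_to_space` rows in the summary show `Param # = 0`; all learned capacity sits in the `Conv2D` layers feeding them.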
Epoch 1/100
200/200 [==============================] - 18s 35ms/step - loss: 25.2333 - PSNR: 20.3749 - val_loss: 13.6808 - val_PSNR: 22.6089
Epoch 2/100
200/200 [==============================] - 6s 30ms/step - loss: 12.4858 - PSNR: 26.2728 - val_loss: 11.4983 - val_PSNR: 24.6409
Epoch 3/100
200/200 [==============================] - 5s 27ms/step - loss: 10.4668 - PSNR: 26.8301 - val_loss: 9.6027 - val_PSNR: 26.8766
Epoch 4/100
200/200 [==============================] - 6s 28ms/step - loss: 9.8721 - PSNR: 27.7045 - val_loss: 8.8500 - val_PSNR: 27.8287
Epoch 5/100
200/200 [==============================] - 5s 27ms/step - loss: 9.1341 - PSNR: 28.5026 - val_loss: 7.9045 - val_PSNR: 28.2596
Epoch 6/100
200/200 [==============================] - 6s 28ms/step - loss: 8.7897 - PSNR: 29.8974 - val_loss: 9.4292 - val_PSNR: 24.8986
Epoch 7/100
200/200 [==============================] - 6s 32ms/step - loss: 9.0199 - PSNR: 29.1673 - val_loss: 8.7044 - val_PSNR: 29.7241
Epoch 8/100
200/200 [==============================] - 6s 28ms/step - loss: 8.4820 - PSNR: 29.7034 - val_loss: 8.8417 - val_PSNR: 31.3544
Epoch 9/100
200/200 [==============================] - 5s 26ms/step - loss: 8.4316 - PSNR: 29.8078 - val_loss: 8.5900 - val_PSNR: 28.9421
Epoch 10/100
200/200 [==============================] - 5s 27ms/step - loss: 8.2946 - PSNR: 30.0365 - val_loss: 7.4339 - val_PSNR: 29.1136
Epoch 11/100
200/200 [==============================] - 5s 26ms/step - loss: 8.4228 - PSNR: 30.0249 - val_loss: 8.6601 - val_PSNR: 30.1600
Epoch 12/100
200/200 [==============================] - 5s 27ms/step - loss: 8.1963 - PSNR: 30.1632 - val_loss: 8.3386 - val_PSNR: 30.6601
Epoch 13/100
200/200 [==============================] - 7s 34ms/step - loss: 8.1001 - PSNR: 30.4627 - val_loss: 7.7866 - val_PSNR: 33.2122
Epoch 14/100
200/200 [==============================] - 5s 27ms/step - loss: 7.9469 - PSNR: 29.9507 - val_loss: 7.9877 - val_PSNR: 32.2222
Epoch 15/100
200/200 [==============================] - 5s 26ms/step - loss: 7.7279 - PSNR: 31.8348 - val_loss: 7.7731 - val_PSNR: 31.0497
Epoch 16/100
200/200 [==============================] - 5s 27ms/step - loss: 7.9677 - PSNR: 30.2106 - val_loss: 7.7994 - val_PSNR: 31.3827
Epoch 17/100
200/200 [==============================] - 5s 26ms/step - loss: 7.8931 - PSNR: 30.7313 - val_loss: 7.9766 - val_PSNR: 29.5453
Epoch 18/100
200/200 [==============================] - 5s 27ms/step - loss: 7.6843 - PSNR: 31.3412 - val_loss: 8.2510 - val_PSNR: 28.6770
Epoch 19/100
200/200 [==============================] - 6s 31ms/step - loss: 7.8055 - PSNR: 30.4235 - val_loss: 7.8415 - val_PSNR: 29.1318
Epoch 20/100
200/200 [==============================] - 6s 28ms/step - loss: 7.7386 - PSNR: 31.0232 - val_loss: 7.8105 - val_PSNR: 30.2151
Epoch 21/100
200/200 [==============================] - 5s 26ms/step - loss: 7.6195 - PSNR: 31.1602 - val_loss: 7.7592 - val_PSNR: 30.3119
Epoch 22/100
200/200 [==============================] - 5s 27ms/step - loss: 7.6426 - PSNR: 31.2932 - val_loss: 7.1461 - val_PSNR: 32.0227
Epoch 23/100
200/200 [==============================] - 5s 26ms/step - loss: 7.6226 - PSNR: 31.0163 - val_loss: 8.0579 - val_PSNR: 30.1762
Epoch 24/100
200/200 [==============================] - 6s 29ms/step - loss: 7.6480 - PSNR: 30.6599 - val_loss: 8.0934 - val_PSNR: 29.8697
Epoch 25/100
200/200 [==============================] - 6s 28ms/step - loss: 7.5750 - PSNR: 31.1081 - val_loss: 7.9514 - val_PSNR: 32.9799
Epoch 26/100
200/200 [==============================] - 6s 29ms/step - loss: 7.3917 - PSNR: 31.6575 - val_loss: 7.6364 - val_PSNR: 26.6087
Epoch 27/100
200/200 [==============================] - 5s 26ms/step - loss: 7.3176 - PSNR: 32.3013 - val_loss: 7.6633 - val_PSNR: 31.3405
Epoch 28/100
200/200 [==============================] - 5s 27ms/step - loss: 7.3311 - PSNR: 31.6139 - val_loss: 7.5953 - val_PSNR: 32.7543
Epoch 29/100
200/200 [==============================] - 5s 26ms/step - loss: 7.3638 - PSNR: 32.3780 - val_loss: 8.3250 - val_PSNR: 28.6227
Epoch 30/100
200/200 [==============================] - 6s 29ms/step - loss: 7.3927 - PSNR: 31.6756 - val_loss: 7.3233 - val_PSNR: 31.7045
Epoch 31/100
200/200 [==============================] - 5s 26ms/step - loss: 7.4323 - PSNR: 31.7463 - val_loss: 7.6066 - val_PSNR: 28.5696
Epoch 32/100
200/200 [==============================] - 6s 31ms/step - loss: 7.4299 - PSNR: 32.2868 - val_loss: 7.1006 - val_PSNR: 33.0525
Epoch 33/100
200/200 [==============================] - 5s 26ms/step - loss: 7.2396 - PSNR: 31.2076 - val_loss: 6.3351 - val_PSNR: 28.5790
Epoch 34/100
200/200 [==============================] - 6s 28ms/step - loss: 7.3123 - PSNR: 32.0217 - val_loss: 6.3687 - val_PSNR: 36.0653
Epoch 35/100
200/200 [==============================] - 5s 26ms/step - loss: 7.2346 - PSNR: 32.4096 - val_loss: 8.0430 - val_PSNR: 28.2147
Epoch 36/100
200/200 [==============================] - 6s 28ms/step - loss: 7.2360 - PSNR: 32.3641 - val_loss: 7.6573 - val_PSNR: 32.1628
Epoch 37/100
200/200 [==============================] - 5s 26ms/step - loss: 7.2239 - PSNR: 32.5704 - val_loss: 8.0844 - val_PSNR: 28.6159
Epoch 38/100
200/200 [==============================] - 6s 31ms/step - loss: 7.2970 - PSNR: 31.6224 - val_loss: 7.2920 - val_PSNR: 31.0073
Epoch 39/100
200/200 [==============================] - 5s 27ms/step - loss: 7.2181 - PSNR: 32.5179 - val_loss: 7.6805 - val_PSNR: 30.7279
Epoch 40/100
200/200 [==============================] - 5s 27ms/step - loss: 7.2278 - PSNR: 31.5681 - val_loss: 7.6034 - val_PSNR: 30.0579
Epoch 41/100
200/200 [==============================] - 6s 29ms/step - loss: 7.2963 - PSNR: 32.1550 - val_loss: 7.4417 - val_PSNR: 36.0715
Epoch 42/100
200/200 [==============================] - 5s 27ms/step - loss: 7.2363 - PSNR: 32.9067 - val_loss: 7.4447 - val_PSNR: 32.4258
Epoch 43/100
200/200 [==============================] - 5s 26ms/step - loss: 7.5023 - PSNR: 30.8614 - val_loss: 7.8812 - val_PSNR: 30.4203
Epoch 44/100
200/200 [==============================] - 6s 30ms/step - loss: 7.2480 - PSNR: 32.0459 - val_loss: 6.9023 - val_PSNR: 35.5986
Epoch 45/100
200/200 [==============================] - 6s 28ms/step - loss: 7.1397 - PSNR: 32.3037 - val_loss: 7.3573 - val_PSNR: 33.6363
Epoch 46/100
200/200 [==============================] - 5s 27ms/step - loss: 7.3287 - PSNR: 31.8207 - val_loss: 7.4366 - val_PSNR: 34.5691
Epoch 47/100
200/200 [==============================] - 6s 28ms/step - loss: 7.0634 - PSNR: 32.3188 - val_loss: 7.9104 - val_PSNR: 28.6757
Epoch 48/100
200/200 [==============================] - 5s 27ms/step - loss: 7.1986 - PSNR: 31.3927 - val_loss: 7.7528 - val_PSNR: 27.1336
Epoch 49/100
200/200 [==============================] - 5s 26ms/step - loss: 7.1643 - PSNR: 33.4114 - val_loss: 7.5648 - val_PSNR: 30.0425
Epoch 50/100
200/200 [==============================] - 6s 28ms/step - loss: 7.1890 - PSNR: 32.2585 - val_loss: 7.2326 - val_PSNR: 28.7174
Epoch 51/100
200/200 [==============================] - 6s 29ms/step - loss: 7.2189 - PSNR: 31.6128 - val_loss: 7.5821 - val_PSNR: 30.2950
Epoch 52/100
200/200 [==============================] - 5s 26ms/step - loss: 7.1684 - PSNR: 32.0145 - val_loss: 6.9536 - val_PSNR: 27.9504
Epoch 53/100
200/200 [==============================] - 6s 28ms/step - loss: 7.1453 - PSNR: 33.0041 - val_loss: 6.7535 - val_PSNR: 31.2205
Epoch 54/100
200/200 [==============================] - 5s 27ms/step - loss: 7.0614 - PSNR: 32.4300 - val_loss: 8.1967 - val_PSNR: 27.0083
Epoch 55/100
200/200 [==============================] - 5s 26ms/step - loss: 7.2219 - PSNR: 31.7131 - val_loss: 7.4755 - val_PSNR: 32.3220
Epoch 56/100
200/200 [==============================] - 5s 27ms/step - loss: 7.1124 - PSNR: 33.0947 - val_loss: 6.6832 - val_PSNR: 35.5691
Epoch 57/100
200/200 [==============================] - 6s 30ms/step - loss: 7.1712 - PSNR: 31.9047 - val_loss: 6.6813 - val_PSNR: 31.9253
Epoch 58/100
200/200 [==============================] - 6s 29ms/step - loss: 7.0599 - PSNR: 32.6826 - val_loss: 7.4546 - val_PSNR: 31.1693
Epoch 59/100
200/200 [==============================] - 5s 26ms/step - loss: 7.1098 - PSNR: 32.1502 - val_loss: 7.2395 - val_PSNR: 35.3842
Epoch 60/100
200/200 [==============================] - 5s 27ms/step - loss: 7.2238 - PSNR: 32.2158 - val_loss: 6.4755 - val_PSNR: 30.3727
Epoch 61/100
200/200 [==============================] - 5s 26ms/step - loss: 7.3387 - PSNR: 31.5687 - val_loss: 6.8755 - val_PSNR: 31.8305
Epoch 62/100
200/200 [==============================] - 5s 27ms/step - loss: 7.1190 - PSNR: 32.0613 - val_loss: 6.7420 - val_PSNR: 33.0621
Epoch 63/100
200/200 [==============================] - 6s 28ms/step - loss: 6.9937 - PSNR: 32.1875 - val_loss: 7.9939 - val_PSNR: 28.1862
Epoch 64/100
200/200 [==============================] - 6s 31ms/step - loss: 7.0225 - PSNR: 32.9399 - val_loss: 6.7883 - val_PSNR: 34.5441
Epoch 65/100
200/200 [==============================] - 5s 26ms/step - loss: 7.0939 - PSNR: 32.5201 - val_loss: 6.7222 - val_PSNR: 32.8601
Epoch 66/100
200/200 [==============================] - 5s 27ms/step - loss: 6.9875 - PSNR: 31.3872 - val_loss: 7.0665 - val_PSNR: 35.6490
Epoch 67/100
200/200 [==============================] - 5s 26ms/step - loss: 6.9182 - PSNR: 32.5588 - val_loss: 7.2270 - val_PSNR: 31.5646
Epoch 68/100
200/200 [==============================] - 6s 28ms/step - loss: 6.9844 - PSNR: 31.5089 - val_loss: 7.1639 - val_PSNR: 31.7423
Epoch 69/100
200/200 [==============================] - 5s 27ms/step - loss: 6.9884 - PSNR: 32.0169 - val_loss: 6.8075 - val_PSNR: 28.4672
Epoch 70/100
200/200 [==============================] - 7s 35ms/step - loss: 7.1175 - PSNR: 32.9970 - val_loss: 7.1848 - val_PSNR: 36.3859
Epoch 71/100
200/200 [==============================] - 5s 26ms/step - loss: 7.1289 - PSNR: 32.1638 - val_loss: 7.6893 - val_PSNR: 31.2196
Epoch 72/100
200/200 [==============================] - 5s 27ms/step - loss: 7.1117 - PSNR: 31.4547 - val_loss: 7.1194 - val_PSNR: 34.6924
Epoch 73/100
200/200 [==============================] - 5s 26ms/step - loss: 7.0666 - PSNR: 32.7448 - val_loss: 7.3615 - val_PSNR: 30.4052
Epoch 74/100
200/200 [==============================] - 5s 27ms/step - loss: 6.9327 - PSNR: 32.7398 - val_loss: 6.1811 - val_PSNR: 33.8368
Epoch 75/100
200/200 [==============================] - 5s 26ms/step - loss: 6.9319 - PSNR: 32.1695 - val_loss: 7.3398 - val_PSNR: 29.8306
Epoch 76/100
200/200 [==============================] - 6s 32ms/step - loss: 7.1276 - PSNR: 32.0655 - val_loss: 7.2084 - val_PSNR: 32.9750
Epoch 77/100
200/200 [==============================] - 5s 26ms/step - loss: 7.1195 - PSNR: 31.4600 - val_loss: 6.9514 - val_PSNR: 30.3558
Epoch 78/100
200/200 [==============================] - 5s 27ms/step - loss: 7.1253 - PSNR: 31.7375 - val_loss: 7.0992 - val_PSNR: 28.9164
Epoch 79/100
200/200 [==============================] - 5s 26ms/step - loss: 6.9564 - PSNR: 32.3520 - val_loss: 7.6750 - val_PSNR: 28.9759
Epoch 80/100
200/200 [==============================] - 5s 27ms/step - loss: 6.8972 - PSNR: 32.9621 - val_loss: 7.1400 - val_PSNR: 28.8804
Epoch 81/100
200/200 [==============================] - 6s 28ms/step - loss: 7.0764 - PSNR: 31.6302 - val_loss: 7.1007 - val_PSNR: 28.2628
Epoch 82/100
200/200 [==============================] - 6s 29ms/step - loss: 6.8769 - PSNR: 32.3613 - val_loss: 6.9629 - val_PSNR: 29.1378
Epoch 83/100
200/200 [==============================] - 5s 27ms/step - loss: 6.9680 - PSNR: 32.5873 - val_loss: 7.0751 - val_PSNR: 33.2482
Epoch 84/100
200/200 [==============================] - 5s 27ms/step - loss: 7.0043 - PSNR: 32.2063 - val_loss: 7.2731 - val_PSNR: 29.3679
Epoch 85/100
200/200 [==============================] - 5s 26ms/step - loss: 6.7534 - PSNR: 33.0820 - val_loss: 6.8243 - val_PSNR: 29.6444
Epoch 86/100
200/200 [==============================] - 6s 28ms/step - loss: 7.0345 - PSNR: 32.5960 - val_loss: 7.1598 - val_PSNR: 36.7871
Epoch 87/100
200/200 [==============================] - 6s 28ms/step - loss: 6.9880 - PSNR: 32.7461 - val_loss: 6.9862 - val_PSNR: 33.1579
Epoch 88/100
200/200 [==============================] - 5s 27ms/step - loss: 6.9516 - PSNR: 31.6473 - val_loss: 7.4859 - val_PSNR: 34.3017
Epoch 89/100
200/200 [==============================] - 6s 30ms/step - loss: 7.0513 - PSNR: 33.6575 - val_loss: 6.8780 - val_PSNR: 33.5961
Epoch 90/100
200/200 [==============================] - 5s 27ms/step - loss: 7.0564 - PSNR: 33.0095 - val_loss: 6.7381 - val_PSNR: 30.2279
Epoch 91/100
200/200 [==============================] - 5s 26ms/step - loss: 7.1222 - PSNR: 32.5882 - val_loss: 6.7333 - val_PSNR: 26.6774
Epoch 92/100
200/200 [==============================] - 6s 28ms/step - loss: 6.9388 - PSNR: 34.0912 - val_loss: 7.5571 - val_PSNR: 30.8489
Epoch 93/100
200/200 [==============================] - 5s 27ms/step - loss: 7.1218 - PSNR: 32.2461 - val_loss: 7.2177 - val_PSNR: 28.8383
Epoch 94/100
200/200 [==============================] - 5s 27ms/step - loss: 7.0351 - PSNR: 32.6331 - val_loss: 7.1227 - val_PSNR: 28.3270
Epoch 95/100
200/200 [==============================] - 6s 29ms/step - loss: 6.9415 - PSNR: 32.2853 - val_loss: 6.8431 - val_PSNR: 32.0888
Epoch 96/100
200/200 [==============================] - 6s 28ms/step - loss: 6.9538 - PSNR: 32.8972 - val_loss: 6.9115 - val_PSNR: 30.3815
Epoch 97/100
200/200 [==============================] - 5s 26ms/step - loss: 7.0913 - PSNR: 31.4599 - val_loss: 6.6948 - val_PSNR: 28.2055
Epoch 98/100
200/200 [==============================] - 6s 29ms/step - loss: 7.1208 - PSNR: 32.1172 - val_loss: 6.5200 - val_PSNR: 34.9918
Epoch 99/100
200/200 [==============================] - 5s 26ms/step - loss: 6.9173 - PSNR: 32.2961 - val_loss: 7.3477 - val_PSNR: 27.4530
Epoch 100/100
200/200 [==============================] - 5s 27ms/step - loss: 7.0332 - PSNR: 32.2218 - val_loss: 6.5501 - val_PSNR: 30.7528
Training time: 567.6723699569702

Conclusion¶

From our experiment we found that increasing the number of Residual Blocks slightly improves performance. For transposed convolution and sub-pixel convolution, the PSNR increased by 1.99 dB and 1.74 dB respectively. For UpSampling2D, however, increasing the number of Residual Blocks decreased the PSNR by 0.35 dB.
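The PSNR figures compared above follow the standard peak signal-to-noise-ratio definition, PSNR = 10·log10(MAX² / MSE). A minimal NumPy sketch (a hypothetical helper for illustration, not the exact metric function used during training):

```python
import numpy as np

def psnr(y_true, y_pred, max_val=255.0):
    """Peak signal-to-noise ratio in dB between two images."""
    mse = np.mean((y_true.astype(np.float64) - y_pred.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)
```

Higher PSNR means the reconstruction is closer to the ground-truth high-resolution image; a difference of 1-2 dB, as reported here, is typically a visible quality gap in SR benchmarks.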

We also found that transposed convolution generally gives better results than the default sub-pixel convolution method, although it takes longer to train. The best result was obtained with transposed-convolution upsampling and 16 Residual Blocks, which achieved a PSNR of 37.94 dB.
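The core of the default sub-pixel (pixel-shuffle) upsampling block is a depth-to-space rearrangement: a convolution produces C·scale² channels, which are then shuffled into a scale× larger spatial grid. A plain-NumPy sketch of that rearrangement (a hypothetical helper mirroring the behaviour of `tf.nn.depth_to_space`, not the notebook's code):

```python
import numpy as np

def depth_to_space(x, scale):
    """Pixel shuffle: rearrange (H, W, C*scale^2) -> (H*scale, W*scale, C)."""
    h, w, c = x.shape
    c_out = c // (scale * scale)
    x = x.reshape(h, w, scale, scale, c_out)
    x = x.transpose(0, 2, 1, 3, 4)  # interleave the block rows/columns spatially
    return x.reshape(h * scale, w * scale, c_out)
```

Because the shuffle is just a reshape/transpose, sub-pixel upsampling adds no extra learned parameters beyond its preceding convolution, whereas Conv2DTranspose learns its own upsampling kernel, which may explain its longer training time.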

Interestingly, with only 2 Residual Blocks the simple UpSampling2D method achieved very good results in terms of both time and quality: this model took only 263.31 seconds to train yet reached a PSNR of 36.29 dB.

There is scope for further experiments with a varying number of channels and a larger number of Residual Blocks. We also implemented only the baseline EDSR model; the EDSR+, MDSR (multi-scale super-resolution) and MDSR+ models proposed in the same paper remain to be explored.

Reference:¶

  • https://sh-tsang.medium.com/review-edsr-mdsr-enhanced-deep-residual-networks-for-single-image-super-resolution-super-4364f3b7f86f

  • https://www.analyticsvidhya.com/blog/2021/05/deep-learning-for-image-super-resolution/

  • http://krasserm.github.io/2019/09/04/super-resolution/

  • https://www.youtube.com/watch?v=fMwti6zFcYY&ab_channel=DigitalSreeni

  • https://towardsdatascience.com/types-of-convolutions-in-deep-learning-717013397f4d
